[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
7554 1726853145.46084: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
7554 1726853145.46922: Added group all to inventory
7554 1726853145.46924: Added group ungrouped to inventory
7554 1726853145.46928: Group all now contains ungrouped
7554 1726853145.46931: Examining possible inventory source: /tmp/network-iHm/inventory.yml
7554 1726853145.67857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
7554 1726853145.67927: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
7554 1726853145.67955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
7554 1726853145.68022: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
7554 1726853145.68118: Loaded config def from plugin (inventory/script)
7554 1726853145.68120: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
7554 1726853145.68167: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
7554 1726853145.68257: Loaded config def from plugin (inventory/yaml)
7554 1726853145.68260: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
7554 1726853145.68373: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
7554 1726853145.69197: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
7554 1726853145.69201: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
7554 1726853145.69204: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
7554 1726853145.69210: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
7554 1726853145.69214: Loading data from /tmp/network-iHm/inventory.yml
7554 1726853145.69301: /tmp/network-iHm/inventory.yml was not parsable by auto
7554 1726853145.69600: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
7554 1726853145.69701: Loading data from /tmp/network-iHm/inventory.yml
7554 1726853145.70254: group all already in inventory
7554 1726853145.70261: set inventory_file for managed_node1
7554 1726853145.70266: set inventory_dir for managed_node1
7554 1726853145.70267: Added host managed_node1 to inventory
7554 1726853145.70269: Added host managed_node1 to group all
7554 1726853145.70270: set ansible_host for managed_node1
7554 1726853145.70274: set ansible_ssh_extra_args for managed_node1
7554 1726853145.70278: set inventory_file for managed_node2
7554 1726853145.70281: set inventory_dir for managed_node2
7554 1726853145.70282: Added host managed_node2 to inventory
7554 1726853145.70284: Added host managed_node2 to group all
7554 1726853145.70285: set ansible_host for managed_node2
7554 1726853145.70286: set ansible_ssh_extra_args for managed_node2
7554 1726853145.70288: set inventory_file for managed_node3
7554 1726853145.70291: set inventory_dir for managed_node3
7554 1726853145.70291: Added host managed_node3 to inventory
7554 1726853145.70293: Added host managed_node3 to group all
7554 1726853145.70294: set ansible_host for managed_node3
7554 1726853145.70294: set ansible_ssh_extra_args for managed_node3
7554 1726853145.70297: Reconcile groups and hosts in inventory.
7554 1726853145.70301: Group ungrouped now contains managed_node1
7554 1726853145.70303: Group ungrouped now contains managed_node2
7554 1726853145.70304: Group ungrouped now contains managed_node3
7554 1726853145.70387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
7554 1726853145.70569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
7554 1726853145.70618: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
7554 1726853145.70649: Loaded config def from plugin (vars/host_group_vars)
7554 1726853145.70651: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
7554 1726853145.70668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
7554 1726853145.70678: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
7554 1726853145.70720: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
7554 1726853145.71067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853145.71164: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
7554 1726853145.71213: Loaded config def from plugin (connection/local)
7554 1726853145.71216: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
7554 1726853145.72058: Loaded config def from plugin (connection/paramiko_ssh)
7554 1726853145.72061: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
7554 1726853145.73776: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7554 1726853145.73815: Loaded config def from plugin (connection/psrp)
7554 1726853145.73818: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
7554 1726853145.75260: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7554 1726853145.75303: Loaded config def from plugin (connection/ssh)
7554 1726853145.75306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
7554 1726853145.77672: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7554 1726853145.77711: Loaded config def from plugin (connection/winrm)
7554 1726853145.77714: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
7554 1726853145.77745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
7554 1726853145.77811: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
7554 1726853145.77884: Loaded config def from plugin (shell/cmd)
7554 1726853145.77886: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
7554 1726853145.77913: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
7554 1726853145.77982: Loaded config def from plugin (shell/powershell)
7554 1726853145.77984: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
7554 1726853145.78034: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
7554 1726853145.78221: Loaded config def from plugin (shell/sh)
7554 1726853145.78224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
7554 1726853145.78262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
7554 1726853145.78385: Loaded config def from plugin (become/runas)
7554 1726853145.78388: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
7554 1726853145.78574: Loaded config def from plugin (become/su)
7554 1726853145.78576: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
7554 1726853145.78732: Loaded config def from plugin (become/sudo)
7554 1726853145.78734: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
7554 1726853145.78766: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml
7554 1726853145.79076: in VariableManager get_vars()
7554 1726853145.79096: done with get_vars()
7554 1726853145.79221: trying /usr/local/lib/python3.12/site-packages/ansible/modules
7554 1726853145.82093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
7554 1726853145.82206: in VariableManager get_vars()
7554 1726853145.82210: done with get_vars()
7554 1726853145.82213: variable 'playbook_dir' from source: magic vars
7554 1726853145.82214: variable 'ansible_playbook_python' from source: magic vars
7554 1726853145.82215: variable 'ansible_config_file' from source: magic vars
7554 1726853145.82215: variable 'groups' from source: magic vars
7554 1726853145.82216: variable 'omit' from source: magic vars
7554 1726853145.82217: variable 'ansible_version' from source: magic vars
7554 1726853145.82218: variable 'ansible_check_mode' from source: magic vars
7554 1726853145.82218: variable 'ansible_diff_mode' from source: magic vars
7554 1726853145.82219: variable 'ansible_forks' from source: magic vars
7554 1726853145.82220: variable 'ansible_inventory_sources' from source: magic vars
7554 1726853145.82221: variable 'ansible_skip_tags' from source: magic vars
7554 1726853145.82221: variable 'ansible_limit' from source: magic vars
7554 1726853145.82222: variable 'ansible_run_tags' from source: magic vars
7554 1726853145.82224: variable 'ansible_verbosity' from source: magic vars
7554 1726853145.82260: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml
7554 1726853145.82963: in VariableManager get_vars()
7554 1726853145.82980: done with get_vars()
7554 1726853145.83018: in VariableManager get_vars()
7554 1726853145.83032: done with get_vars()
7554 1726853145.83069: in VariableManager get_vars()
7554 1726853145.83083: done with get_vars()
7554 1726853145.83216: in VariableManager get_vars()
7554 1726853145.83230: done with get_vars()
7554 1726853145.83236: variable 'omit' from source: magic vars
7554 1726853145.83257: variable 'omit' from source: magic vars
7554 1726853145.83291: in VariableManager get_vars()
7554 1726853145.83301: done with get_vars()
7554 1726853145.83345: in VariableManager get_vars()
7554 1726853145.83360: done with get_vars()
7554 1726853145.83397: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7554 1726853145.83609: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7554 1726853145.83736: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7554 1726853145.84789: in VariableManager get_vars()
7554 1726853145.84808: done with get_vars()
7554 1726853145.85521: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
7554 1726853145.85660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7554 1726853145.87435: in VariableManager get_vars()
7554 1726853145.87455: done with get_vars()
7554 1726853145.87494: in VariableManager get_vars()
7554 1726853145.87587: done with get_vars()
7554 1726853145.88521: in VariableManager get_vars()
7554 1726853145.88539: done with get_vars()
7554 1726853145.88543: variable 'omit' from source: magic vars
7554 1726853145.88558: variable 'omit' from source: magic vars
7554 1726853145.88591: in VariableManager get_vars()
7554 1726853145.88606: done with get_vars()
7554 1726853145.88626: in VariableManager get_vars()
7554 1726853145.88642: done with get_vars()
7554 1726853145.88733: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7554 1726853145.88859: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7554 1726853145.88938: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7554 1726853145.90988: in VariableManager get_vars()
7554 1726853145.91029: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7554 1726853145.93194: in VariableManager get_vars()
7554 1726853145.93214: done with get_vars()
7554 1726853145.93322: in VariableManager get_vars()
7554 1726853145.93341: done with get_vars()
7554 1726853145.93398: in VariableManager get_vars()
7554 1726853145.93417: done with get_vars()
7554 1726853145.93422: variable 'omit' from source: magic vars
7554 1726853145.93433: variable 'omit' from source: magic vars
7554 1726853145.93466: in VariableManager get_vars()
7554 1726853145.93483: done with get_vars()
7554 1726853145.93502: in VariableManager get_vars()
7554 1726853145.93518: done with get_vars()
7554 1726853145.93543: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7554 1726853145.93712: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7554 1726853145.93792: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7554 1726853145.94209: in VariableManager get_vars()
7554 1726853145.94232: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7554 1726853145.96632: in VariableManager get_vars()
7554 1726853145.96656: done with get_vars()
7554 1726853145.96900: in VariableManager get_vars()
7554 1726853145.96921: done with get_vars()
7554 1726853145.97422: in VariableManager get_vars()
7554 1726853145.97462: done with get_vars()
7554 1726853145.97467: variable 'omit' from source: magic vars
7554 1726853145.97480: variable 'omit' from source: magic vars
7554 1726853145.97508: in VariableManager get_vars()
7554 1726853145.97526: done with get_vars()
7554 1726853145.97543: in VariableManager get_vars()
7554 1726853145.97565: done with get_vars()
7554 1726853145.97593: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7554 1726853145.97705: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7554 1726853145.97784: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7554 1726853145.98180: in VariableManager get_vars()
7554 1726853145.98204: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7554 1726853146.00185: in VariableManager get_vars()
7554 1726853146.00209: done with get_vars()
7554 1726853146.00248: in VariableManager get_vars()
7554 1726853146.00272: done with get_vars()
7554 1726853146.00324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
7554 1726853146.00338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
7554 1726853146.00592: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
7554 1726853146.00745: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
7554 1726853146.00750: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
7554 1726853146.00782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
7554 1726853146.00806: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
7554 1726853146.00973: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
7554 1726853146.01063: Loaded config def from plugin (callback/default)
7554 1726853146.01066: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
7554 1726853146.02967: Loaded config def from plugin (callback/junit)
7554 1726853146.02970: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
7554 1726853146.03015: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
7554 1726853146.03086: Loaded config def from plugin (callback/minimal)
7554 1726853146.03099: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
7554 1726853146.03137: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
7554 1726853146.03202: Loaded config def from plugin (callback/tree)
7554 1726853146.03204: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
7554 1726853146.03317: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
7554 1726853146.03320: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_auto_gateway_nm.yml ********************************************
2 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml
7554 1726853146.03348: in VariableManager get_vars()
7554 1726853146.03359: done with get_vars()
7554 1726853146.03365: in VariableManager get_vars()
7554 1726853146.03375: done with get_vars()
7554 1726853146.03379: variable 'omit' from source: magic vars
7554 1726853146.03413: in VariableManager get_vars()
7554 1726853146.03428: done with get_vars()
7554 1726853146.03449: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_auto_gateway.yml' with nm as provider] *****
7554 1726853146.03977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
7554 1726853146.04050: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
7554 1726853146.04081: getting the remaining hosts for this loop
7554 1726853146.04082: done getting the remaining hosts for this loop
7554 1726853146.04085: getting the next task for host managed_node3
7554 1726853146.04088: done getting next task for host managed_node3
7554 1726853146.04090: ^ task is: TASK: Gathering Facts
7554 1726853146.04092: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853146.04094: getting variables
7554 1726853146.04095: in VariableManager get_vars()
7554 1726853146.04103: Calling all_inventory to load vars for managed_node3
7554 1726853146.04105: Calling groups_inventory to load vars for managed_node3
7554 1726853146.04108: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853146.04119: Calling all_plugins_play to load vars for managed_node3
7554 1726853146.04130: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853146.04134: Calling groups_plugins_play to load vars for managed_node3
7554 1726853146.04169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853146.04225: done with get_vars()
7554 1726853146.04231: done getting variables
7554 1726853146.04316: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:6
Friday 20 September 2024 13:25:46 -0400 (0:00:00.010) 0:00:00.010 ******
7554 1726853146.04339: entering _queue_task() for managed_node3/gather_facts
7554 1726853146.04341: Creating lock for gather_facts
7554 1726853146.04679: worker is 1 (out of 1 available)
7554 1726853146.04689: exiting _queue_task() for managed_node3/gather_facts
7554 1726853146.04700: done queuing things up, now waiting for results queue to drain
7554 1726853146.04702: waiting for pending results...
7554 1726853146.04991: running TaskExecutor() for managed_node3/TASK: Gathering Facts
7554 1726853146.05077: in run() - task 02083763-bbaf-bdc3-98b6-000000000155
7554 1726853146.05090: variable 'ansible_search_path' from source: unknown
7554 1726853146.05178: calling self._execute()
7554 1726853146.05198: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853146.05209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853146.05225: variable 'omit' from source: magic vars
7554 1726853146.05330: variable 'omit' from source: magic vars
7554 1726853146.05367: variable 'omit' from source: magic vars
7554 1726853146.05407: variable 'omit' from source: magic vars
7554 1726853146.05462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7554 1726853146.05544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7554 1726853146.05550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7554 1726853146.05553: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7554 1726853146.05567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7554 1726853146.05603: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7554 1726853146.05612: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853146.05620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853146.05728: Set connection var ansible_shell_executable to /bin/sh
7554 1726853146.05763: Set connection var ansible_pipelining to False
7554 1726853146.05765: Set connection var ansible_shell_type to sh
7554 1726853146.05768: Set connection var ansible_connection to ssh
7554 1726853146.05769: Set connection var ansible_timeout to 10
7554 1726853146.05781: Set connection var ansible_module_compression to ZIP_DEFLATED
7554 1726853146.05802: variable 'ansible_shell_executable' from source: unknown
7554 1726853146.05872: variable 'ansible_connection' from source: unknown
7554 1726853146.05876: variable 'ansible_module_compression' from source: unknown
7554 1726853146.05955: variable 'ansible_shell_type' from source: unknown
7554 1726853146.05959: variable 'ansible_shell_executable' from source: unknown
7554 1726853146.05962: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853146.05964: variable 'ansible_pipelining' from source: unknown
7554 1726853146.05966: variable 'ansible_timeout' from source: unknown
7554 1726853146.05968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853146.06178: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7554 1726853146.06181: variable 'omit' from source: magic vars
7554 1726853146.06183: starting attempt loop
7554 1726853146.06186: running the handler
7554 1726853146.06190: variable 'ansible_facts' from source: unknown
7554 1726853146.06192: _low_level_execute_command(): starting
7554 1726853146.06194: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7554 1726853146.07044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
<<<
7554 1726853146.07162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config
<<<
7554 1726853146.07186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.217 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b'
debug2: fd 3 setting O_NONBLOCK
<<<
7554 1726853146.07357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<<
7554 1726853146.07418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
7554 1726853146.09137: stdout chunk (state=3): >>>/root
<<<
7554 1726853146.09303: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
7554 1726853146.09306: stdout chunk (state=3): >>><<<
7554 1726853146.09309: stderr chunk (state=3): >>><<<
7554 1726853146.09329: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.217 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7554 1726853146.09376: _low_level_execute_command(): starting
7554 1726853146.09380: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809 `" && echo ansible-tmp-1726853146.0933511-7589-155231366754809="` echo /root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809 `" ) && sleep 0'
7554 1726853146.09937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
<<<
7554 1726853146.09952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config
<<<
7554 1726853146.09969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
<<<
7554 1726853146.10029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.217 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
7554 1726853146.10282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b'
debug2: fd 3 setting O_NONBLOCK
<<<
7554 1726853146.10323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<<
7554 1726853146.10432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
7554 1726853146.12409: stdout chunk (state=3): >>>ansible-tmp-1726853146.0933511-7589-155231366754809=/root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809
<<<
7554 1726853146.12577: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
7554 1726853146.12580: stdout chunk (state=3): >>><<<
7554 1726853146.12583: stderr chunk (state=3): >>><<<
7554 1726853146.12590: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853146.0933511-7589-155231366754809=/root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.217 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7554 1726853146.12667: variable 'ansible_module_compression' from source: unknown
7554 1726853146.12852: ANSIBALLZ: Using generic lock for ansible.legacy.setup
7554 1726853146.12855: ANSIBALLZ: Acquiring lock
7554 1726853146.12857: ANSIBALLZ: Lock acquired: 140257826526304
7554 1726853146.12859: ANSIBALLZ: Creating module
7554 1726853146.49933: ANSIBALLZ: Writing module into payload
7554 1726853146.50113: ANSIBALLZ: Writing module
7554 1726853146.50140: ANSIBALLZ: Renaming module
7554 1726853146.50156: ANSIBALLZ: Done creating module
7554 1726853146.50211: variable 'ansible_facts' from source: unknown
7554 1726853146.50225: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7554 1726853146.50239: _low_level_execute_command(): starting
7554 1726853146.50252: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
7554 1726853146.50936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
<<<
7554 1726853146.50953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config
<<<
7554 1726853146.50981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
<<<
7554 1726853146.51000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
<<<
7554 1726853146.51116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match not found
debug1: Reading
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853146.51127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853146.51144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853146.51250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853146.52957: stdout chunk (state=3): >>>PLATFORM <<< 7554 1726853146.53047: stdout chunk (state=3): >>>Linux <<< 7554 1726853146.53070: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 <<< 7554 1726853146.53083: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 7554 1726853146.53263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853146.53266: stdout chunk (state=3): >>><<< 7554 1726853146.53269: stderr chunk (state=3): >>><<< 7554 1726853146.53413: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853146.53419 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 7554 1726853146.53423: _low_level_execute_command(): starting 7554 1726853146.53426: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 7554 1726853146.53567: Sending initial data 7554 1726853146.53570: Sent initial data (1181 bytes) 7554 1726853146.54085: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853146.54144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853146.54162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853146.54188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853146.54443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853146.57969: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 7554 1726853146.58390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853146.58427: stderr chunk (state=3): >>><<< 7554 1726853146.58436: stdout chunk (state=3): >>><<< 7554 1726853146.58675: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red 
Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853146.58679: variable 'ansible_facts' from source: unknown 7554 1726853146.58682: variable 'ansible_facts' from source: unknown 7554 1726853146.58684: variable 'ansible_module_compression' from source: unknown 7554 1726853146.58686: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7554 1726853146.58688: variable 'ansible_facts' from source: unknown 7554 1726853146.58845: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809/AnsiballZ_setup.py 7554 1726853146.59078: Sending initial data 7554 1726853146.59088: Sent initial data (152 bytes) 7554 1726853146.59690: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853146.59703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853146.59795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853146.61443: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7554 1726853146.61457: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7554 1726853146.61468: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 7554 1726853146.61480: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 7554 1726853146.61490: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 7554 1726853146.61500: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 7554 1726853146.61511: stderr chunk (state=3): >>>debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 <<< 7554 1726853146.61521: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 7554 1726853146.61530: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7554 1726853146.61539: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853146.61552: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853146.61639: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853146.61711: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmptz9xt2p_ /root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809/AnsiballZ_setup.py <<< 7554 1726853146.61716: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809/AnsiballZ_setup.py" <<< 7554 1726853146.61780: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmptz9xt2p_" to remote "/root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809/AnsiballZ_setup.py" <<< 7554 1726853146.63549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853146.63553: stdout chunk (state=3): >>><<< 7554 1726853146.63555: stderr chunk (state=3): >>><<< 7554 1726853146.63557: done transferring module to remote 7554 1726853146.63561: _low_level_execute_command(): starting 7554 1726853146.63568: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809/ 
/root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809/AnsiballZ_setup.py && sleep 0' 7554 1726853146.64589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853146.64789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853146.64808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853146.64894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853146.66730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853146.66772: stderr chunk (state=3): >>><<< 7554 1726853146.66877: stdout chunk (state=3): >>><<< 7554 1726853146.66881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853146.66884: _low_level_execute_command(): starting 7554 1726853146.66886: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809/AnsiballZ_setup.py && sleep 0' 7554 1726853146.67845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853146.67850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853146.67852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853146.67854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853146.68013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853146.70249: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 7554 1726853146.70258: stdout chunk (state=3): >>>import _imp # builtin <<< 7554 1726853146.70305: stdout chunk (state=3): >>>import '_thread' # <<< 7554 1726853146.70309: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 7554 1726853146.70364: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 7554 1726853146.70407: stdout chunk (state=3): >>>import 'posix' # <<< 7554 1726853146.70443: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 7554 1726853146.70463: stdout chunk (state=3): >>>import 'time' # <<< 7554 1726853146.70483: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 7554 1726853146.70539: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 7554 1726853146.70554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 7554 1726853146.70574: stdout chunk (state=3): >>>import 'codecs' # <<< 7554 1726853146.70599: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 7554 1726853146.70674: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec451bc4d0> <<< 7554 1726853146.70677: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4518bb00> <<< 7554 1726853146.70698: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec451bea50> <<< 7554 1726853146.70717: stdout chunk (state=3): >>>import '_signal' # <<< 7554 1726853146.70787: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 7554 1726853146.70790: stdout chunk (state=3): >>>import 'io' # <<< 7554 1726853146.70793: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 7554 1726853146.70901: stdout chunk (state=3): >>>import '_collections_abc' # <<< 7554 1726853146.70904: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 7554 1726853146.70975: stdout chunk (state=3): >>>import 'os' # <<< 7554 1726853146.70982: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 7554 1726853146.71033: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 7554 1726853146.71040: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 7554 1726853146.71058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 7554 
1726853146.71076: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec451cd130> <<< 7554 1726853146.71129: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 7554 1726853146.71162: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec451cdfa0> <<< 7554 1726853146.71165: stdout chunk (state=3): >>>import 'site' # <<< 7554 1726853146.71194: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 7554 1726853146.71578: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 7554 1726853146.71607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 7554 1726853146.71629: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853146.71657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 7554 1726853146.71684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 7554 1726853146.71700: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 7554 1726853146.71730: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 7554 1726853146.71772: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fabdd0> <<< 7554 1726853146.71804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 7554 1726853146.71821: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fabfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 7554 1726853146.71866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 7554 1726853146.71889: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 7554 1726853146.71940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853146.71943: stdout chunk (state=3): >>>import 'itertools' # <<< 7554 1726853146.72017: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fe3800> <<< 7554 1726853146.72021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fe3e90> <<< 7554 1726853146.72036: stdout chunk (state=3): >>>import '_collections' # <<< 7554 1726853146.72082: 
stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fc3aa0> <<< 7554 1726853146.72098: stdout chunk (state=3): >>>import '_functools' # <<< 7554 1726853146.72127: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fc11c0> <<< 7554 1726853146.72206: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fa8f80> <<< 7554 1726853146.72257: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 7554 1726853146.72277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 7554 1726853146.72313: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 7554 1726853146.72347: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 7554 1726853146.72402: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45003770> <<< 7554 1726853146.72412: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45002390> <<< 7554 1726853146.72432: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fc2090> <<< 7554 1726853146.72443: stdout chunk (state=3): >>>import 
're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45000ad0> <<< 7554 1726853146.72490: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 7554 1726853146.72509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45038800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fa8200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 7554 1726853146.72525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 7554 1726853146.72569: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45038cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45038b60> <<< 7554 1726853146.72611: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45038ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fa6d20> <<< 7554 1726853146.72661: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' 
<<< 7554 1726853146.72694: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 7554 1726853146.72720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 7554 1726853146.72735: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45039550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45039220> import 'importlib.machinery' # <<< 7554 1726853146.72770: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 7554 1726853146.72799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4503a450> <<< 7554 1726853146.72815: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 7554 1726853146.72832: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 7554 1726853146.72882: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 7554 1726853146.72906: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45050680> <<< 7554 1726853146.72935: stdout chunk (state=3): >>>import 'errno' # <<< 7554 1726853146.72970: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45051d60> <<< 7554 1726853146.72975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 7554 1726853146.73019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 7554 1726853146.73023: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 7554 1726853146.73038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45052c00> <<< 7554 1726853146.73087: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45053260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45052150> <<< 7554 1726853146.73109: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 7554 1726853146.73152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 7554 1726853146.73176: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45053ce0> <<< 7554 1726853146.73201: stdout chunk (state=3): 
>>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45053410> <<< 7554 1726853146.73224: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4503a4b0> <<< 7554 1726853146.73251: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 7554 1726853146.73291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 7554 1726853146.73305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 7554 1726853146.73332: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44d47bc0> <<< 7554 1726853146.73379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 7554 1726853146.73409: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44d70710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d70470> <<< 7554 1726853146.73448: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # 
extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44d70740> <<< 7554 1726853146.73463: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 7554 1726853146.73537: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853146.73663: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44d71070> <<< 7554 1726853146.73824: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44d71a60> <<< 7554 1726853146.73832: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d70920> <<< 7554 1726853146.73902: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d45d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 7554 1726853146.73907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 7554 1726853146.73910: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 7554 1726853146.73953: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d72de0> <<< 7554 1726853146.73984: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d70ef0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4503aba0> <<< 7554 1726853146.73992: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 7554 1726853146.74061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853146.74081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 7554 1726853146.74106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 7554 1726853146.74130: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d9b170> <<< 7554 1726853146.74205: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 7554 1726853146.74241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 7554 1726853146.74249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 7554 1726853146.74325: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44dbf530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches 
/usr/lib64/python3.12/pathlib.py <<< 7554 1726853146.74357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 7554 1726853146.74420: stdout chunk (state=3): >>>import 'ntpath' # <<< 7554 1726853146.74442: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44e202f0> <<< 7554 1726853146.74472: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 7554 1726853146.74499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 7554 1726853146.74514: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 7554 1726853146.74555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 7554 1726853146.74635: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44e22a20> <<< 7554 1726853146.74721: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44e203e0> <<< 7554 1726853146.74787: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44de5310> <<< 7554 1726853146.74792: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447213a0> <<< 7554 
1726853146.74805: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44dbe330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d73d40> <<< 7554 1726853146.75175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fec44dbe690> <<< 7554 1726853146.75544: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_4k73f3a2/ansible_ansible.legacy.setup_payload.zip' <<< 7554 1726853146.75552: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.75758: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.75794: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 7554 1726853146.75799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 7554 1726853146.75880: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 7554 1726853146.76002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 7554 1726853146.76006: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447870e0> import '_typing' # <<< 7554 1726853146.76343: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44765fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44765130><<< 7554 
1726853146.76361: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7554 1726853146.76408: stdout chunk (state=3): >>> import 'ansible' # <<< 7554 1726853146.76435: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853146.76469: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7554 1726853146.76473: stdout chunk (state=3): >>> <<< 7554 1726853146.76507: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 7554 1726853146.76543: stdout chunk (state=3): >>> # zipimport: zlib available <<< 7554 1726853146.78462: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.80413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44784f80> <<< 7554 1726853146.80459: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853146.80492: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 7554 1726853146.80566: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fec447b6a80> <<< 7554 1726853146.80726: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447b6810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447b6120> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 7554 1726853146.80739: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447b68a0> <<< 7554 1726853146.80742: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44787b00> <<< 7554 1726853146.80756: stdout chunk (state=3): >>>import 'atexit' # <<< 7554 1726853146.80779: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853146.80784: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec447b77a0> <<< 7554 1726853146.80817: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853146.80820: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec447b79b0> <<< 7554 1726853146.80976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 7554 1726853146.80988: 
stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447b7ec0> <<< 7554 1726853146.80996: stdout chunk (state=3): >>>import 'pwd' # <<< 7554 1726853146.81028: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 7554 1726853146.81067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 7554 1726853146.81108: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44621c70> <<< 7554 1726853146.81203: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44623890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 7554 1726853146.81249: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44624290> <<< 7554 1726853146.81283: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 7554 1726853146.81306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 7554 1726853146.81327: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44625400> <<< 7554 1726853146.81344: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 7554 1726853146.81395: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 7554 1726853146.81418: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 7554 1726853146.81466: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44627ec0> <<< 7554 1726853146.81515: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44fa6e10> <<< 7554 1726853146.81547: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44626180> <<< 7554 1726853146.81574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 7554 1726853146.81620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 7554 1726853146.81623: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 7554 1726853146.81778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 7554 1726853146.81781: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 7554 1726853146.81808: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4462fc80> import '_tokenize' # <<< 7554 1726853146.81878: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4462e750> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4462e4e0> <<< 7554 1726853146.81908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 7554 1726853146.81973: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4462ea20> <<< 7554 1726853146.82001: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44626690> <<< 7554 1726853146.82045: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44673ef0> <<< 7554 1726853146.82093: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44674470> <<< 7554 1726853146.82117: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 7554 1726853146.82121: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 7554 1726853146.82190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 7554 1726853146.82194: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44675b50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44675910> <<< 7554 1726853146.82220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 7554 1726853146.82238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 7554 1726853146.82296: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec446780e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44676240> <<< 7554 1726853146.82309: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 7554 1726853146.82369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853146.82403: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 7554 1726853146.82438: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4467b800> <<< 7554 1726853146.82562: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec446781d0> <<< 7554 1726853146.82638: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4467c5f0> <<< 7554 1726853146.82658: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4467c9e0> <<< 7554 1726853146.82728: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4467cad0> <<< 7554 1726853146.82753: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44674230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 7554 1726853146.82779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 7554 1726853146.82792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 7554 1726853146.82838: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853146.82852: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec445081a0> <<< 7554 1726853146.83007: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853146.83018: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44509580> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4467e930> <<< 7554 1726853146.83078: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4467fce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4467e570> <<< 7554 1726853146.83105: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available <<< 7554 1726853146.83115: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 7554 1726853146.83202: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.83306: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.83354: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available <<< 7554 1726853146.83359: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 7554 1726853146.83484: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.83599: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.84158: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.84710: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 7554 1726853146.84758: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 7554 1726853146.84761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853146.84810: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4450d670> <<< 7554 1726853146.84902: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 7554 1726853146.84930: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4450e540> <<< 7554 1726853146.84961: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445080e0> <<< 7554 1726853146.84996: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 7554 1726853146.85024: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.85041: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 7554 1726853146.85191: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.85338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 7554 1726853146.85365: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4450e510> # zipimport: zlib available <<< 7554 1726853146.85876: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.86260: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.86426: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.86429: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 7554 1726853146.86450: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.86484: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 7554 1726853146.86533: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.86646: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.86681: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 7554 1726853146.86695: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.86729: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.86766: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 7554 1726853146.86785: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.87003: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.87234: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 7554 1726853146.87318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 7554 1726853146.87386: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4450f5f0> <<< 7554 1726853146.87414: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.87468: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.87555: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 7554 1726853146.87587: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 7554 1726853146.87591: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.87635: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.87680: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 7554 1726853146.87691: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.87724: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.87769: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.87821: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 7554 1726853146.87893: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 7554 1726853146.87936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853146.88033: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4451a090> <<< 7554 1726853146.88087: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44515880> <<< 7554 1726853146.88113: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 7554 1726853146.88191: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.88249: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.88275: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.88327: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853146.88359: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 7554 1726853146.88382: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 7554 1726853146.88394: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 7554 1726853146.88453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 7554 1726853146.88487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 7554 1726853146.88543: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44602840> <<< 7554 1726853146.88586: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447d6540> <<< 7554 1726853146.88676: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44519dc0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4467cc80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 7554 1726853146.88697: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.88745: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.88751: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 7554 1726853146.88823: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 7554 1726853146.88833: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.88848: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 7554 1726853146.88905: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.89001: stdout chunk (state=3): >>># zipimport: zlib available <<< 
7554 1726853146.89100: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853146.89173: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853146.89184: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 7554 1726853146.89611: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 7554 1726853146.89614: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.89841: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853146.89896: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 7554 1726853146.89919: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 7554 1726853146.89975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445ae1e0> <<< 7554 1726853146.90008: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 7554 1726853146.90101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 7554 1726853146.90104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4416ffb0> <<< 7554 1726853146.90308: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec441745c0> <<< 7554 1726853146.90312: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4459a9c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445aed50> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445ac8c0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445ac530> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 7554 1726853146.90394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 7554 1726853146.90420: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 7554 1726853146.90533: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44177380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44176c30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44176e10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44176090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 7554 1726853146.90704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 7554 1726853146.90755: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44177500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 7554 1726853146.90874: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec441da000> <<< 7554 1726853146.90878: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44177f50> <<< 7554 1726853146.90880: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445ac590> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 7554 1726853146.90964: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.90986: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.91020: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 7554 1726853146.91063: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.91076: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.91137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 7554 1726853146.91169: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 7554 1726853146.91239: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 7554 1726853146.91243: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.91350: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.91385: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 7554 1726853146.91431: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 7554 
1726853146.91442: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.91499: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.91558: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.91618: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.91680: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 7554 1726853146.91694: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.92353: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.92630: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 7554 1726853146.92666: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.92882: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 7554 1726853146.92916: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853146.92954: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 7554 1726853146.92976: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.93098: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.93138: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 7554 1726853146.93181: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.93233: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 7554 1726853146.93270: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.93310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 7554 
1726853146.93367: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.93455: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.93574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 7554 1726853146.93615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec441dbb60> <<< 7554 1726853146.93644: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 7554 1726853146.93689: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 7554 1726853146.93876: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec441dac90> import 'ansible.module_utils.facts.system.local' # <<< 7554 1726853146.93987: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853146.94083: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 7554 1726853146.94103: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.94227: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.94365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 7554 1726853146.94368: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.94459: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.94574: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 7554 1726853146.94693: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 7554 
1726853146.94775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 7554 1726853146.94866: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853146.95001: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44212330> <<< 7554 1726853146.95266: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44202f00> <<< 7554 1726853146.95284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 7554 1726853146.95453: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # <<< 7554 1726853146.95467: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.95581: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.95779: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.96012: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.96046: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 7554 1726853146.96093: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.96138: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 7554 1726853146.96189: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.96202: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.96256: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 7554 1726853146.96365: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4422a270> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4422a1e0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 7554 1726853146.96596: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 7554 1726853146.96599: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.96742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 7554 1726853146.96745: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.96888: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.96945: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.97000: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.97027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 7554 1726853146.97065: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 7554 1726853146.97115: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853146.97387: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.97494: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 7554 1726853146.97502: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.dragonfly' # <<< 7554 1726853146.97508: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.97696: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.97864: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 7554 1726853146.97918: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.97924: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.98046: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.98923: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853146.99766: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 7554 1726853146.99787: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 7554 1726853146.99817: stdout chunk (state=3): >>> # zipimport: zlib available <<< 7554 1726853147.00006: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853147.00157: stdout chunk (state=3): >>> <<< 7554 1726853147.00183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 7554 1726853147.00207: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.00367: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.00525: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 7554 1726853147.00556: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853147.00562: stdout chunk (state=3): >>> <<< 7554 1726853147.00808: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.01056: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 7554 1726853147.01083: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.01109: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.01128: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.network' # <<< 7554 1726853147.01159: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.01233: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.01314: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 7554 1726853147.01620: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853147.01956: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.02310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 7554 1726853147.02316: stdout chunk (state=3): >>> <<< 7554 1726853147.02342: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.02398: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853147.02400: stdout chunk (state=3): >>> <<< 7554 1726853147.02458: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 7554 1726853147.02481: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853147.02483: stdout chunk (state=3): >>> <<< 7554 1726853147.02519: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.02577: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 7554 1726853147.02675: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.02959: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 7554 1726853147.03001: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 7554 1726853147.03017: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 
1726853147.03094: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.03168: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 7554 1726853147.03178: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.03513: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.03785: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 7554 1726853147.03924: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 7554 1726853147.03959: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.04026: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 7554 1726853147.04057: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.04280: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 7554 1726853147.04339: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 7554 1726853147.04583: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 7554 1726853147.04664: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.04718: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.04823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 
'ansible.module_utils.facts.virtual.freebsd' # <<< 7554 1726853147.04835: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 7554 1726853147.04875: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.04984: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 7554 1726853147.05321: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.05593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 7554 1726853147.05680: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.05711: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 7554 1726853147.05751: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.05852: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # <<< 7554 1726853147.05907: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.06043: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.06176: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 7554 1726853147.06212: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available<<< 7554 1726853147.06215: stdout chunk (state=3): >>> <<< 7554 1726853147.06559: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 7554 1726853147.06610: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.07191: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 7554 1726853147.07241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec43fbeb70> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec43fbc110> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec43fbcd70> <<< 7554 1726853147.30728: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44004fb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44004bf0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object 
from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44006360> <<< 7554 1726853147.30732: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44005e20> <<< 7554 1726853147.30982: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 7554 1726853147.57182: stdout chunk (state=3): >>> <<< 7554 1726853147.57478: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3063, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 468, "free": 3063}, "nocache": {"free": 3337, "used": 194}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21<<< 7554 1726853147.57707: stdout chunk (state=3): >>>f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 291, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261815541760, "block_size": 4096, "block_total": 65519099, "block_available": 63919810, "block_used": 1599289, "inode_total": 131070960, "inode_available": 131029198, "inode_used": 41762, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "25", "second": "47", "epoch": "1726853147", "epoch_int": "1726853147", "date": "2024-09-20", "time": "13:25:47", "iso8601_micro": "2024-09-20T17:25:47.512642Z", "iso8601": "2024-09-20T17:25:47Z", "iso8601_basic": "20240920T132547512642", "iso8601_basic_short": "20240920T132547", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": 
"/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.1435546875, "5m": 0.189453125, "15m": 0.09423828125}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", 
"prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": 
"off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7554 1726853147.58282: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 7554 1726853147.58289: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv<<< 7554 1726853147.58297: stdout chunk (state=3): >>> # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type<<< 7554 1726853147.58432: stdout chunk (state=3): >>> # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing 
_frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing 
bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath <<< 7554 1726853147.58435: stdout chunk (state=3): >>># cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse <<< 7554 1726853147.58494: stdout chunk (state=3): >>># destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__<<< 7554 1726853147.58518: stdout chunk (state=3): >>> # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale<<< 7554 1726853147.58597: stdout chunk (state=3): >>> # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] 
removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd<<< 7554 1726853147.58600: stdout chunk (state=3): >>> # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128<<< 7554 1726853147.58603: stdout chunk (state=3): >>> # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat<<< 7554 1726853147.58619: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian<<< 7554 1726853147.58664: stdout chunk (state=3): >>> # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 7554 1726853147.58696: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec<<< 7554 1726853147.58773: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info<<< 7554 1726853147.58824: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.sys_info # cleanup[2] removing 
ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns<<< 7554 1726853147.58908: stdout chunk (state=3): >>> # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user<<< 7554 1726853147.58933: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos<<< 7554 1726853147.58963: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace<<< 7554 1726853147.59012: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.other # destroy 
ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr<<< 7554 1726853147.59177: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy 
ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 7554 1726853147.59757: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 7554 1726853147.59766: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 7554 1726853147.59827: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 7554 1726853147.59831: 
stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 7554 1726853147.59883: stdout chunk (state=3): >>># destroy ntpath <<< 7554 1726853147.59915: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__<<< 7554 1726853147.59926: stdout chunk (state=3): >>> # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder<<< 7554 1726853147.59984: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 7554 1726853147.60025: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 7554 1726853147.60079: stdout chunk (state=3): >>># destroy selinux # destroy shutil<<< 7554 1726853147.60116: stdout chunk (state=3): >>> # destroy distro # destroy distro.distro<<< 7554 1726853147.60164: stdout chunk (state=3): >>> # destroy argparse # destroy logging <<< 7554 1726853147.60202: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors <<< 7554 1726853147.60245: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing<<< 7554 1726853147.60264: stdout chunk (state=3): >>> # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle<<< 7554 1726853147.60327: stdout chunk (state=3): >>> # destroy _pickle # destroy queue<<< 7554 1726853147.60332: stdout chunk (state=3): >>> # destroy _heapq # destroy _queue<<< 7554 1726853147.60357: stdout chunk (state=3): >>> # destroy multiprocessing.reduction<<< 7554 1726853147.60395: stdout chunk (state=3): >>> # destroy selectors<<< 7554 1726853147.60437: stdout chunk (state=3): >>> # destroy shlex # destroy fcntl<<< 7554 
1726853147.60440: stdout chunk (state=3): >>> # destroy datetime <<< 7554 1726853147.60461: stdout chunk (state=3): >>># destroy subprocess<<< 7554 1726853147.60500: stdout chunk (state=3): >>> # destroy base64 # destroy _ssl <<< 7554 1726853147.60538: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 7554 1726853147.60583: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy json<<< 7554 1726853147.60632: stdout chunk (state=3): >>> # destroy socket<<< 7554 1726853147.60635: stdout chunk (state=3): >>> <<< 7554 1726853147.60683: stdout chunk (state=3): >>># destroy struct <<< 7554 1726853147.60686: stdout chunk (state=3): >>># destroy glob # destroy fnmatch<<< 7554 1726853147.60702: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 7554 1726853147.60730: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection<<< 7554 1726853147.60756: stdout chunk (state=3): >>> <<< 7554 1726853147.60827: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser<<< 7554 1726853147.60876: stdout chunk (state=3): >>> # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian<<< 7554 1726853147.60879: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon<<< 7554 1726853147.60903: stdout chunk (state=3): >>> # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] 
wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 7554 1726853147.60940: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 7554 1726853147.60958: stdout chunk (state=3): >>> <<< 7554 1726853147.60991: stdout chunk (state=3): >>># cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 7554 1726853147.61025: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 7554 1726853147.61060: stdout chunk (state=3): >>> # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 7554 1726853147.61089: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum<<< 7554 1726853147.61114: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 7554 1726853147.61192: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping 
codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 7554 1726853147.61217: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref<<< 7554 1726853147.61237: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib<<< 7554 1726853147.61277: stdout chunk (state=3): >>> # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 7554 1726853147.61298: stdout chunk (state=3): >>> # destroy selinux._selinux # destroy systemd._daemon<<< 7554 1726853147.61363: stdout chunk (state=3): >>> # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 7554 1726853147.61619: stdout chunk (state=3): >>># destroy sys.monitoring <<< 7554 1726853147.61664: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 7554 1726853147.61724: stdout chunk (state=3): >>># destroy platform<<< 7554 1726853147.61754: stdout chunk (state=3): >>> # destroy _uuid # destroy stat<<< 7554 1726853147.61780: stdout chunk (state=3): >>> # destroy genericpath # destroy re._parser # destroy tokenize<<< 7554 1726853147.61815: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib # destroy copyreg <<< 7554 1726853147.61867: stdout chunk (state=3): >>># destroy contextlib # destroy _typing <<< 7554 1726853147.61929: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools<<< 7554 1726853147.61955: stdout chunk (state=3): >>> # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external <<< 
7554 1726853147.62020: stdout chunk (state=3): >>># destroy _imp # destroy _io # destroy marshal # clear sys.meta_path<<< 7554 1726853147.62023: stdout chunk (state=3): >>> # clear sys.modules <<< 7554 1726853147.62212: stdout chunk (state=3): >>># destroy _frozen_importlib # destroy codecs # destroy encodings.aliases <<< 7554 1726853147.62280: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io<<< 7554 1726853147.62315: stdout chunk (state=3): >>> # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 7554 1726853147.62350: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time <<< 7554 1726853147.62409: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib<<< 7554 1726853147.62437: stdout chunk (state=3): >>> # destroy _operator # destroy _sre<<< 7554 1726853147.62525: stdout chunk (state=3): >>> # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins<<< 7554 1726853147.62528: stdout chunk (state=3): >>> # destroy _thread # clear sys.audit hooks<<< 7554 1726853147.62674: stdout chunk (state=3): >>> <<< 7554 1726853147.63139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853147.63143: stdout chunk (state=3): >>><<< 7554 1726853147.63145: stderr chunk (state=3): >>><<< 7554 1726853147.63313: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec451bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4518bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec451bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec451cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec451cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fabdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fabfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fe3800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fe3e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fc3aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fc11c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fa8f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45003770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45002390> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fc2090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45000ad0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45038800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fa8200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45038cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45038b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45038ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44fa6d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45039550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45039220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4503a450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45050680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45051d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45052c00> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45053260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45052150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45053ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45053410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4503a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44d47bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44d70710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d70470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44d70740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44d71070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44d71a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d70920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d45d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d72de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d70ef0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4503aba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d9b170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44dbf530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44e202f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44e22a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44e203e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44de5310> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447213a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44dbe330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44d73d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fec44dbe690> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_4k73f3a2/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec447870e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44765fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44765130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44784f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec447b6a80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447b6810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447b6120> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447b68a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44787b00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec447b77a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec447b79b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447b7ec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44621c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44623890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec44624290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44625400> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44627ec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44fa6e10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44626180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fec4462fc80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4462e750> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4462e4e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4462ea20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44626690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44673ef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44674470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44675b50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44675910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec446780e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44676240> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4467b800> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec446781d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4467c5f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4467c9e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4467cad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44674230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec445081a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44509580> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4467e930> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4467fce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4467e570> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4450d670> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4450e540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445080e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4450e510> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4450f5f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4451a090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44515880> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44602840> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec447d6540> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44519dc0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4467cc80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445ae1e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4416ffb0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec441745c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4459a9c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445aed50> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445ac8c0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445ac530> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44177380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44176c30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44176e10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44176090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44177500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec441da000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44177f50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec445ac590> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec441dbb60> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec441dac90> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec44212330> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44202f00> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4422a270> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4422a1e0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec43fbeb70> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec43fbc110> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec43fbcd70> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44004fb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44004bf0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44006360> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec44005e20> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": 
"disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3063, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 468, "free": 3063}, "nocache": {"free": 3337, "used": 194}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], 
"labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 291, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261815541760, "block_size": 4096, "block_total": 65519099, "block_available": 63919810, "block_used": 1599289, "inode_total": 131070960, "inode_available": 131029198, "inode_used": 41762, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": 
{"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "25", "second": "47", "epoch": "1726853147", "epoch_int": "1726853147", "date": "2024-09-20", "time": "13:25:47", "iso8601_micro": "2024-09-20T17:25:47.512642Z", "iso8601": "2024-09-20T17:25:47Z", "iso8601_basic": "20240920T132547512642", "iso8601_basic_short": "20240920T132547", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.1435546875, "5m": 0.189453125, "15m": 0.09423828125}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": 
"link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off 
[fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # 
restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] 
removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] 
removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings 
# destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] 
removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing 
ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # 
cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy 
ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy 
ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. [WARNING]: Module invocation had junk after the JSON data:
removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] 
removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing 
ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy 
ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # 
destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # 
cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] 
wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of 
another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 7554 1726853147.64165: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853147.64168: _low_level_execute_command(): starting 7554 1726853147.64173: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853146.0933511-7589-155231366754809/ > /dev/null 2>&1 && sleep 0' 7554 1726853147.64320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853147.64323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853147.64325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853147.64327: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853147.64329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853147.64388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853147.64391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853147.64397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853147.64468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853147.67160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853147.67196: stderr chunk (state=3): >>><<< 7554 1726853147.67199: stdout chunk (state=3): >>><<< 7554 1726853147.67213: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853147.67220: handler run complete 7554 1726853147.67306: variable 'ansible_facts' from source: unknown 7554 1726853147.67368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853147.67648: variable 'ansible_facts' from source: unknown 7554 1726853147.67889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853147.67892: attempt loop complete, returning result 7554 1726853147.67894: _execute() done 7554 1726853147.67896: dumping result to json 7554 1726853147.67898: done dumping result, returning 7554 1726853147.67900: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-bdc3-98b6-000000000155] 7554 1726853147.67902: sending task result for task 02083763-bbaf-bdc3-98b6-000000000155 7554 1726853147.68506: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000155 7554 1726853147.68516: WORKER PROCESS EXITING ok: [managed_node3] 7554 1726853147.68637: no more pending results, returning what we have 7554 1726853147.68641: results queue empty 7554 1726853147.68641: checking for any_errors_fatal 7554 1726853147.68643: done checking for any_errors_fatal 7554 1726853147.68643: checking for max_fail_percentage 7554 1726853147.68645: done checking for max_fail_percentage 7554 1726853147.68648: checking to see if all hosts have failed and the running result is not ok 7554 1726853147.68649: done checking to see if all hosts have failed 7554 1726853147.68650: getting the remaining hosts for this loop 7554 1726853147.68651: done getting the remaining hosts for this loop 7554 1726853147.68654: 
getting the next task for host managed_node3 7554 1726853147.68660: done getting next task for host managed_node3 7554 1726853147.68662: ^ task is: TASK: meta (flush_handlers) 7554 1726853147.68663: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853147.68677: getting variables 7554 1726853147.68679: in VariableManager get_vars() 7554 1726853147.68700: Calling all_inventory to load vars for managed_node3 7554 1726853147.68703: Calling groups_inventory to load vars for managed_node3 7554 1726853147.68706: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853147.68715: Calling all_plugins_play to load vars for managed_node3 7554 1726853147.68718: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853147.68721: Calling groups_plugins_play to load vars for managed_node3 7554 1726853147.68997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853147.69132: done with get_vars() 7554 1726853147.69145: done getting variables 7554 1726853147.69194: in VariableManager get_vars() 7554 1726853147.69201: Calling all_inventory to load vars for managed_node3 7554 1726853147.69202: Calling groups_inventory to load vars for managed_node3 7554 1726853147.69204: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853147.69207: Calling all_plugins_play to load vars for managed_node3 7554 1726853147.69209: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853147.69211: Calling groups_plugins_play to load vars for managed_node3 7554 1726853147.69291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853147.69408: done 
with get_vars() 7554 1726853147.69417: done queuing things up, now waiting for results queue to drain 7554 1726853147.69418: results queue empty 7554 1726853147.69419: checking for any_errors_fatal 7554 1726853147.69420: done checking for any_errors_fatal 7554 1726853147.69425: checking for max_fail_percentage 7554 1726853147.69426: done checking for max_fail_percentage 7554 1726853147.69427: checking to see if all hosts have failed and the running result is not ok 7554 1726853147.69428: done checking to see if all hosts have failed 7554 1726853147.69428: getting the remaining hosts for this loop 7554 1726853147.69429: done getting the remaining hosts for this loop 7554 1726853147.69431: getting the next task for host managed_node3 7554 1726853147.69435: done getting next task for host managed_node3 7554 1726853147.69436: ^ task is: TASK: Include the task 'el_repo_setup.yml' 7554 1726853147.69437: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853147.69439: getting variables 7554 1726853147.69439: in VariableManager get_vars() 7554 1726853147.69445: Calling all_inventory to load vars for managed_node3 7554 1726853147.69447: Calling groups_inventory to load vars for managed_node3 7554 1726853147.69448: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853147.69452: Calling all_plugins_play to load vars for managed_node3 7554 1726853147.69453: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853147.69456: Calling groups_plugins_play to load vars for managed_node3 7554 1726853147.69531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853147.69635: done with get_vars() 7554 1726853147.69640: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:11 Friday 20 September 2024 13:25:47 -0400 (0:00:01.653) 0:00:01.664 ****** 7554 1726853147.69696: entering _queue_task() for managed_node3/include_tasks 7554 1726853147.69698: Creating lock for include_tasks 7554 1726853147.69908: worker is 1 (out of 1 available) 7554 1726853147.69922: exiting _queue_task() for managed_node3/include_tasks 7554 1726853147.69931: done queuing things up, now waiting for results queue to drain 7554 1726853147.69934: waiting for pending results... 
7554 1726853147.70079: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 7554 1726853147.70135: in run() - task 02083763-bbaf-bdc3-98b6-000000000006 7554 1726853147.70147: variable 'ansible_search_path' from source: unknown 7554 1726853147.70183: calling self._execute() 7554 1726853147.70235: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853147.70239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853147.70247: variable 'omit' from source: magic vars 7554 1726853147.70323: _execute() done 7554 1726853147.70327: dumping result to json 7554 1726853147.70330: done dumping result, returning 7554 1726853147.70333: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-bdc3-98b6-000000000006] 7554 1726853147.70339: sending task result for task 02083763-bbaf-bdc3-98b6-000000000006 7554 1726853147.70427: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000006 7554 1726853147.70430: WORKER PROCESS EXITING 7554 1726853147.70468: no more pending results, returning what we have 7554 1726853147.70474: in VariableManager get_vars() 7554 1726853147.70503: Calling all_inventory to load vars for managed_node3 7554 1726853147.70506: Calling groups_inventory to load vars for managed_node3 7554 1726853147.70509: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853147.70520: Calling all_plugins_play to load vars for managed_node3 7554 1726853147.70522: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853147.70525: Calling groups_plugins_play to load vars for managed_node3 7554 1726853147.70712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853147.70893: done with get_vars() 7554 1726853147.70900: variable 'ansible_search_path' from source: unknown 7554 1726853147.70912: we have included files to process 7554 
1726853147.70913: generating all_blocks data 7554 1726853147.70915: done generating all_blocks data 7554 1726853147.70915: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7554 1726853147.70917: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7554 1726853147.70919: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7554 1726853147.71536: in VariableManager get_vars() 7554 1726853147.71550: done with get_vars() 7554 1726853147.71561: done processing included file 7554 1726853147.71562: iterating over new_blocks loaded from include file 7554 1726853147.71564: in VariableManager get_vars() 7554 1726853147.71575: done with get_vars() 7554 1726853147.71577: filtering new block on tags 7554 1726853147.71590: done filtering new block on tags 7554 1726853147.71593: in VariableManager get_vars() 7554 1726853147.71603: done with get_vars() 7554 1726853147.71604: filtering new block on tags 7554 1726853147.71617: done filtering new block on tags 7554 1726853147.71619: in VariableManager get_vars() 7554 1726853147.71627: done with get_vars() 7554 1726853147.71628: filtering new block on tags 7554 1726853147.71639: done filtering new block on tags 7554 1726853147.71641: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 7554 1726853147.71646: extending task lists for all hosts with included blocks 7554 1726853147.71692: done extending task lists 7554 1726853147.71694: done processing included files 7554 1726853147.71694: results queue empty 7554 1726853147.71695: checking for any_errors_fatal 7554 1726853147.71696: done checking for any_errors_fatal 7554 1726853147.71697: checking for max_fail_percentage 7554 
1726853147.71698: done checking for max_fail_percentage 7554 1726853147.71699: checking to see if all hosts have failed and the running result is not ok 7554 1726853147.71700: done checking to see if all hosts have failed 7554 1726853147.71700: getting the remaining hosts for this loop 7554 1726853147.71702: done getting the remaining hosts for this loop 7554 1726853147.71704: getting the next task for host managed_node3 7554 1726853147.71707: done getting next task for host managed_node3 7554 1726853147.71709: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 7554 1726853147.71711: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853147.71713: getting variables 7554 1726853147.71714: in VariableManager get_vars() 7554 1726853147.71722: Calling all_inventory to load vars for managed_node3 7554 1726853147.71724: Calling groups_inventory to load vars for managed_node3 7554 1726853147.71725: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853147.71730: Calling all_plugins_play to load vars for managed_node3 7554 1726853147.71732: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853147.71734: Calling groups_plugins_play to load vars for managed_node3 7554 1726853147.71883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853147.72055: done with get_vars() 7554 1726853147.72063: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:25:47 -0400 (0:00:00.024) 0:00:01.688 ****** 7554 1726853147.72124: entering _queue_task() for managed_node3/setup 7554 1726853147.72410: worker is 1 (out of 1 available) 7554 1726853147.72421: exiting _queue_task() for managed_node3/setup 7554 1726853147.72432: done queuing things up, now waiting for results queue to drain 7554 1726853147.72433: waiting for pending results... 
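
The task announced above lives at `tests/network/tasks/el_repo_setup.yml:3` but its source is not shown in the log. Based on the task name and the conditional the executor evaluates later (`not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts`), a hypothetical reconstruction of such a task might look like:

```yaml
# Hypothetical sketch -- the actual contents of el_repo_setup.yml are not
# shown in this log; gather_subset/when values are illustrative only.
- name: Gather the minimum subset of ansible_facts required by the network role test
  setup:
    gather_subset: min
  when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts
```

The point of the guard is to avoid a full (slow) fact-gathering pass when the facts the test needs are already cached.
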
7554 1726853147.72639: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 7554 1726853147.72707: in run() - task 02083763-bbaf-bdc3-98b6-000000000166 7554 1726853147.72710: variable 'ansible_search_path' from source: unknown 7554 1726853147.72714: variable 'ansible_search_path' from source: unknown 7554 1726853147.72748: calling self._execute() 7554 1726853147.72797: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853147.72801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853147.72809: variable 'omit' from source: magic vars 7554 1726853147.73175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853147.75178: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853147.75183: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853147.75185: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853147.75196: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853147.75226: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853147.75380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853147.75460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853147.75486: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853147.75539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853147.75563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853147.75781: variable 'ansible_facts' from source: unknown 7554 1726853147.75850: variable 'network_test_required_facts' from source: task vars 7554 1726853147.75879: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 7554 1726853147.75884: variable 'omit' from source: magic vars 7554 1726853147.75908: variable 'omit' from source: magic vars 7554 1726853147.75930: variable 'omit' from source: magic vars 7554 1726853147.75954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853147.75981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853147.75996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853147.76009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853147.76017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853147.76039: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853147.76044: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 
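
The conditional evaluated above (`not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts`) resolves to `True`, so the setup task runs. The logic is plain set arithmetic: gather facts again unless the already-gathered facts fully cover the required list. A minimal Python sketch of that test (function and variable names here are illustrative, not from Ansible's source):

```python
def needs_gathering(gathered, required):
    """True when at least one required fact is missing from `gathered`.

    Mirrors the Jinja2 conditional from the log:
        not gathered | intersect(required) == required
    i.e. skip gathering only when the intersection already equals the
    full required set.
    """
    intersection = [fact for fact in gathered if fact in required]
    return not (set(intersection) == set(required))

# Nothing gathered yet: the setup task must run.
print(needs_gathering([], ["distribution", "os_family"]))           # True
# Every required fact is present: the task is skipped.
print(needs_gathering(["distribution", "os_family", "kernel"],
                      ["distribution", "os_family"]))               # False
```

In this run `ansible_facts` is still empty (only the include has executed), which is why the conditional evaluates to `True` and a real `setup` call follows.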
1726853147.76049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853147.76115: Set connection var ansible_shell_executable to /bin/sh 7554 1726853147.76122: Set connection var ansible_pipelining to False 7554 1726853147.76125: Set connection var ansible_shell_type to sh 7554 1726853147.76127: Set connection var ansible_connection to ssh 7554 1726853147.76134: Set connection var ansible_timeout to 10 7554 1726853147.76139: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853147.76157: variable 'ansible_shell_executable' from source: unknown 7554 1726853147.76161: variable 'ansible_connection' from source: unknown 7554 1726853147.76165: variable 'ansible_module_compression' from source: unknown 7554 1726853147.76168: variable 'ansible_shell_type' from source: unknown 7554 1726853147.76170: variable 'ansible_shell_executable' from source: unknown 7554 1726853147.76174: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853147.76176: variable 'ansible_pipelining' from source: unknown 7554 1726853147.76178: variable 'ansible_timeout' from source: unknown 7554 1726853147.76181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853147.76273: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853147.76280: variable 'omit' from source: magic vars 7554 1726853147.76286: starting attempt loop 7554 1726853147.76290: running the handler 7554 1726853147.76301: _low_level_execute_command(): starting 7554 1726853147.76313: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853147.76764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853147.76768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853147.76772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853147.76820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853147.76824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853147.76899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853147.78853: stdout chunk (state=3): >>>/root <<< 7554 1726853147.79077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853147.79080: stdout chunk (state=3): >>><<< 7554 1726853147.79085: stderr chunk (state=3): >>><<< 7554 1726853147.79091: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7554 1726853147.79101: _low_level_execute_command(): starting 7554 1726853147.79104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579 `" && echo ansible-tmp-1726853147.7901936-7664-1206347560579="` echo /root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579 `" ) && sleep 0' 7554 1726853147.80087: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853147.80096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853147.80128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853147.80150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853147.80170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853147.80260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853147.82907: stdout chunk (state=3): >>>ansible-tmp-1726853147.7901936-7664-1206347560579=/root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579 <<< 7554 1726853147.83072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853147.83076: stdout chunk (state=3): >>><<< 7554 1726853147.83079: stderr chunk (state=3): >>><<< 7554 1726853147.83098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853147.7901936-7664-1206347560579=/root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853147.83157: variable 'ansible_module_compression' from source: unknown 7554 1726853147.83277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7554 1726853147.83281: variable 'ansible_facts' from source: unknown 7554 1726853147.83496: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579/AnsiballZ_setup.py 7554 1726853147.83698: Sending initial data 7554 1726853147.83701: Sent initial data (150 bytes) 7554 1726853147.84114: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853147.84127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853147.84137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
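
The `( umask 77 && mkdir -p "..." && mkdir "ansible-tmp-<timestamp>-<pid>-<rand>" ... )` command above is how Ansible provisions a per-task scratch directory on the remote host: `umask 77` strips group/other permission bits so the directory comes out mode `0700`, and the inner `mkdir` (without `-p`) fails loudly if the name already exists. A rough local Python equivalent of that pattern (paths and naming are illustrative):

```python
import os
import stat
import tempfile
import time

def make_private_task_dir(base):
    """Create base (like `mkdir -p`) and a unique task dir under it,
    both with group/other bits stripped -- the umask-77 trick from the log."""
    old_umask = os.umask(0o77)          # 0777 & ~0o77 -> 0700
    try:
        os.makedirs(base, exist_ok=True)
        task_dir = os.path.join(
            base, "ansible-tmp-%s-%d" % (time.time(), os.getpid()))
        os.mkdir(task_dir)              # no -p: error out on a collision
    finally:
        os.umask(old_umask)             # restore the caller's umask
    return task_dir

base = os.path.join(tempfile.gettempdir(), "demo-ansible-tmp")
d = make_private_task_dir(base)
print(stat.S_IMODE(os.stat(d).st_mode) == 0o700)   # True
```

The `echo ansible-tmp-...=...` tail of the real command exists so the controller can read the resolved directory path back from stdout, which is exactly what the `rc=0, stdout=ansible-tmp-...` line above shows.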
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853147.84187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853147.84199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853147.84270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853147.86659: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7554 1726853147.86663: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853147.86717: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853147.86800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpmsnjld1j /root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579/AnsiballZ_setup.py <<< 7554 1726853147.86804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579/AnsiballZ_setup.py" <<< 7554 1726853147.86845: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpmsnjld1j" to remote "/root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579/AnsiballZ_setup.py" <<< 7554 1726853147.88678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853147.88683: stdout chunk (state=3): >>><<< 7554 1726853147.88685: stderr chunk (state=3): >>><<< 7554 1726853147.88687: done transferring module to remote 7554 1726853147.88689: _low_level_execute_command(): starting 7554 1726853147.88691: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579/ /root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579/AnsiballZ_setup.py && sleep 0' 7554 1726853147.89368: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
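
The `AnsiballZ_setup.py` file being transferred here is Ansible's "AnsiballZ" wrapper: the module and its dependencies are packed into a single self-contained Python artifact (the cache entry `ansible.modules.setup-ZIP_DEFLATED` above names the compression used for the payload), so only one file has to cross the wire per task. A toy illustration of the underlying idea, building a ZIP with a `__main__.py` and executing it as one file (this is a simplified sketch, not Ansible's actual wrapper format):

```python
import os
import subprocess
import sys
import tempfile
import zipfile

# Pack the "module" into a single deflate-compressed archive, the way the
# AnsiballZ cache entry (…-ZIP_DEFLATED) suggests the real payload is built.
payload = os.path.join(tempfile.mkdtemp(), "payload.zip")
with zipfile.ZipFile(payload, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("__main__.py", "print('hello from the zipped module')")

# CPython executes a zip archive directly when it contains __main__.py,
# which is the property a one-file module wrapper relies on.
result = subprocess.run([sys.executable, payload],
                        capture_output=True, text=True)
print(result.stdout.strip())   # hello from the zipped module
```

In the real run the archive is wrapped in a small bootstrap script (hence the `chmod u+x` step that follows), but the one-file-transfer, run-in-place shape is the same.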
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853147.89429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853147.89478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853147.89534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853147.91465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853147.91469: stdout chunk (state=3): >>><<< 7554 1726853147.91473: stderr chunk (state=3): >>><<< 7554 1726853147.91568: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853147.91574: _low_level_execute_command(): starting 7554 1726853147.91577: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579/AnsiballZ_setup.py && sleep 0' 7554 1726853147.92210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853147.92280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853147.92316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853147.92418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853147.94620: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 7554 1726853147.94643: stdout chunk (state=3): >>>import _imp # builtin <<< 7554 1726853147.94673: stdout 
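
Note the module is launched with `PYTHONVERBOSE=1`, which makes the remote interpreter report every import it performs; that is what produces the long runs of `import '...' #` and `# code object from ...` chunks that follow, interleaved with the module's real output. The same trace can be reproduced locally with `-v`, the command-line equivalent of that environment variable:

```python
import subprocess
import sys

# -v is equivalent to PYTHONVERBOSE=1: the interpreter emits a line for
# each module import. In this local invocation the trace lands on stderr.
proc = subprocess.run(
    [sys.executable, "-v", "-c", "import json"],
    capture_output=True, text=True,
)
print(proc.returncode == 0)            # True
print("import" in proc.stderr)         # True: the import trace is present
```

This flag is presumably set so that a module that dies during startup leaves an import trail behind, at the cost of very noisy logs like the chunks below.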
chunk (state=3): >>>import '_thread' # <<< 7554 1726853147.94697: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 7554 1726853147.94756: stdout chunk (state=3): >>>import '_io' # <<< 7554 1726853147.94769: stdout chunk (state=3): >>>import 'marshal' # <<< 7554 1726853147.94789: stdout chunk (state=3): >>>import 'posix' # <<< 7554 1726853147.94835: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 7554 1726853147.94848: stdout chunk (state=3): >>># installing zipimport hook import 'time' # <<< 7554 1726853147.94880: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 7554 1726853147.94908: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 7554 1726853147.94919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853147.94949: stdout chunk (state=3): >>>import '_codecs' # <<< 7554 1726853147.94982: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 7554 1726853147.95018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadb00184d0> <<< 7554 1726853147.95021: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaffe7b30> <<< 7554 1726853147.95068: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 7554 1726853147.95094: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fadb001aa50> <<< 7554 1726853147.95110: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 7554 1726853147.95127: stdout chunk (state=3): >>>import 'io' # <<< 7554 1726853147.95168: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 7554 1726853147.95252: stdout chunk (state=3): >>>import '_collections_abc' # <<< 7554 1726853147.95275: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 7554 1726853147.95328: stdout chunk (state=3): >>>import 'os' # <<< 7554 1726853147.95335: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 7554 1726853147.95359: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 7554 1726853147.95397: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 7554 1726853147.95420: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe2d130> <<< 7554 1726853147.95486: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853147.95534: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe2dfa0> <<< 7554 1726853147.95538: stdout chunk (state=3): >>>import 'site' # <<< 7554 1726853147.95556: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 
20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 7554 1726853147.95925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 7554 1726853147.95969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 7554 1726853147.95975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853147.96002: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 7554 1726853147.96028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 7554 1726853147.96046: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 7554 1726853147.96072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 7554 1726853147.96129: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe6bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 7554 1726853147.96132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 7554 1726853147.96164: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe6bf80> <<< 7554 1726853147.96167: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 7554 1726853147.96190: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 7554 1726853147.96220: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 7554 1726853147.96270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853147.96318: stdout chunk (state=3): >>>import 'itertools' # <<< 7554 1726853147.96324: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafea3830> <<< 7554 1726853147.96363: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafea3ec0> import '_collections' # <<< 7554 1726853147.96419: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe83b60> <<< 7554 1726853147.96429: stdout chunk (state=3): >>>import '_functools' # <<< 7554 1726853147.96464: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe812b0> <<< 7554 1726853147.96548: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe69070> <<< 7554 1726853147.96615: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 7554 1726853147.96618: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 7554 1726853147.96628: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 7554 1726853147.96659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 7554 1726853147.96677: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 7554 1726853147.96700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 7554 1726853147.96725: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafec37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafec23f0> <<< 7554 1726853147.96760: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 7554 1726853147.96772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafec0bc0> <<< 7554 1726853147.96810: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 7554 1726853147.96851: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafef8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe682f0> <<< 7554 1726853147.96883: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 
7554 1726853147.96896: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafef8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafef8bf0> <<< 7554 1726853147.96934: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853147.96972: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafef8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe66e10> <<< 7554 1726853147.97000: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853147.97003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 7554 1726853147.97052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 7554 1726853147.97095: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafef9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafef9370> <<< 7554 1726853147.97101: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # 
code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 7554 1726853147.97124: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafefa540> import 'importlib.util' # <<< 7554 1726853147.97143: stdout chunk (state=3): >>>import 'runpy' # <<< 7554 1726853147.97162: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 7554 1726853147.97202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 7554 1726853147.97238: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaff10740> <<< 7554 1726853147.97242: stdout chunk (state=3): >>>import 'errno' # <<< 7554 1726853147.97289: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaff11e20> <<< 7554 1726853147.97292: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 7554 1726853147.97345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 7554 1726853147.97348: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaff12cc0> 
<<< 7554 1726853147.97392: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaff132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaff12210> <<< 7554 1726853147.97426: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 7554 1726853147.97477: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853147.97488: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaff13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaff134a0> <<< 7554 1726853147.97558: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafefa4b0> <<< 7554 1726853147.97561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 7554 1726853147.97583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 7554 1726853147.97602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 7554 1726853147.97614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 7554 1726853147.97666: stdout chunk (state=3): >>># extension module 'math' loaded 
from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853147.97695: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafc23c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 7554 1726853147.97730: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafc4c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc4c470> <<< 7554 1726853147.97742: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafc4c740> <<< 7554 1726853147.97784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 7554 1726853147.97852: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853147.97990: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853147.98009: stdout chunk (state=3): 
>>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafc4d070> <<< 7554 1726853147.98132: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853147.98147: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafc4da60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc4c920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc21df0> <<< 7554 1726853147.98175: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 7554 1726853147.98210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 7554 1726853147.98249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 7554 1726853147.98273: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc4ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc4db50> <<< 7554 1726853147.98310: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafefac60> <<< 7554 1726853147.98313: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 7554 1726853147.98386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853147.98406: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 7554 1726853147.98423: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 7554 1726853147.98450: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc73170> <<< 7554 1726853147.98522: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 7554 1726853147.98525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853147.98562: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 7554 1726853147.98567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 7554 1726853147.98616: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc9b500> <<< 7554 1726853147.98620: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 7554 1726853147.98663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 7554 1726853147.98721: stdout chunk (state=3): >>>import 'ntpath' # <<< 7554 1726853147.98749: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 7554 1726853147.98761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafcfc260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/parse.py <<< 7554 1726853147.98797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 7554 1726853147.98820: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 7554 1726853147.98859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 7554 1726853147.98942: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafcfe9c0> <<< 7554 1726853147.99032: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafcfc380> <<< 7554 1726853147.99055: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafcc1250> <<< 7554 1726853147.99090: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafafd340> <<< 7554 1726853147.99108: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc9a330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc4fd70> <<< 7554 1726853147.99291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 7554 1726853147.99310: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fadafafd5e0> <<< 7554 1726853147.99621: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_c17x78ax/ansible_setup_payload.zip' <<< 7554 1726853147.99625: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 7554 1726853147.99740: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853147.99775: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 7554 1726853147.99823: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 7554 1726853147.99922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 7554 1726853147.99942: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb670b0> import '_typing' # <<< 7554 1726853148.00141: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb45fa0> <<< 7554 1726853148.00183: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb45130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 7554 1726853148.00209: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 7554 1726853148.00234: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.02567: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.03961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fadafb64f80> <<< 7554 1726853148.03990: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.04058: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 7554 1726853148.04087: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafb96a50> <<< 7554 1726853148.04123: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb967e0> <<< 7554 1726853148.04159: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb960f0> <<< 7554 1726853148.04185: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 7554 1726853148.04223: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb96540> <<< 7554 1726853148.04226: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb67d40> <<< 7554 1726853148.04236: stdout chunk (state=3): >>>import 'atexit' # <<< 
7554 1726853148.04464: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafb97800> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafb97a10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb97f50> <<< 7554 1726853148.04513: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 7554 1726853148.04564: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf529d90> <<< 7554 1726853148.04681: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf52b9b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 7554 1726853148.04695: stdout chunk (state=3): >>>import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fadaf52c380> <<< 7554 1726853148.04704: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 7554 1726853148.04748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 7554 1726853148.04763: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf52d520> <<< 7554 1726853148.04842: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 7554 1726853148.04875: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 7554 1726853148.04882: stdout chunk (state=3): >>> <<< 7554 1726853148.04898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 7554 1726853148.04992: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf52ff80> <<< 7554 1726853148.05048: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.05093: stdout chunk (state=3): >>> import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf5342f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf52e240><<< 7554 1726853148.05098: stdout chunk (state=3): >>> <<< 7554 1726853148.05125: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches 
/usr/lib64/python3.12/traceback.py<<< 7554 1726853148.05163: stdout chunk (state=3): >>> <<< 7554 1726853148.05183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 7554 1726853148.05212: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py<<< 7554 1726853148.05238: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 7554 1726853148.05359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 7554 1726853148.05466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 7554 1726853148.05492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 7554 1726853148.05504: stdout chunk (state=3): >>> <<< 7554 1726853148.05517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 7554 1726853148.05538: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf537fb0><<< 7554 1726853148.05543: stdout chunk (state=3): >>> <<< 7554 1726853148.05577: stdout chunk (state=3): >>>import '_tokenize' # <<< 7554 1726853148.05582: stdout chunk (state=3): >>> <<< 7554 1726853148.05696: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf536a80><<< 7554 1726853148.05699: stdout chunk (state=3): >>> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf5367e0><<< 7554 1726853148.05704: stdout chunk (state=3): >>> <<< 7554 1726853148.05727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py<<< 7554 1726853148.05758: 
stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 7554 1726853148.05762: stdout chunk (state=3): >>> <<< 7554 1726853148.05885: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf536d50><<< 7554 1726853148.05936: stdout chunk (state=3): >>> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf52e750><<< 7554 1726853148.05938: stdout chunk (state=3): >>> <<< 7554 1726853148.05967: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.05997: stdout chunk (state=3): >>> # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.05999: stdout chunk (state=3): >>> import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf57c1d0><<< 7554 1726853148.06064: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 7554 1726853148.06067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.06074: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf57c3b0><<< 7554 1726853148.06118: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 7554 1726853148.06121: stdout chunk (state=3): >>> <<< 7554 1726853148.06154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 7554 1726853148.06159: stdout chunk (state=3): >>> <<< 7554 1726853148.06188: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 7554 1726853148.06193: stdout chunk (state=3): >>> <<< 7554 1726853148.06279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.06285: stdout chunk (state=3): >>> <<< 7554 1726853148.06288: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf57ddc0><<< 7554 1726853148.06290: stdout chunk (state=3): >>> <<< 7554 1726853148.06296: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf57db80> <<< 7554 1726853148.06355: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 7554 1726853148.06389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 7554 1726853148.06456: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.06462: stdout chunk (state=3): >>> # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf580350><<< 7554 1726853148.06478: stdout chunk (state=3): >>> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf57e4b0><<< 7554 1726853148.06517: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 7554 1726853148.06523: 
stdout chunk (state=3): >>> <<< 7554 1726853148.06594: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 7554 1726853148.06627: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 7554 1726853148.06659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 7554 1726853148.06676: stdout chunk (state=3): >>> import '_string' # <<< 7554 1726853148.06682: stdout chunk (state=3): >>> <<< 7554 1726853148.06760: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf583b30><<< 7554 1726853148.06962: stdout chunk (state=3): >>> <<< 7554 1726853148.06979: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf580500> <<< 7554 1726853148.07098: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.07100: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.07101: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf584e30> <<< 7554 1726853148.07165: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.07181: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.07184: stdout chunk (state=3): >>> <<< 7554 1726853148.07186: stdout chunk (state=3): >>>import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf584ce0> <<< 7554 1726853148.07265: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.07268: stdout chunk (state=3): >>> <<< 7554 1726853148.07273: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.07284: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf584f80> <<< 7554 1726853148.07321: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf57c470><<< 7554 1726853148.07323: stdout chunk (state=3): >>> <<< 7554 1726853148.07386: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 7554 1726853148.07406: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 7554 1726853148.07438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 7554 1726853148.07470: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.07496: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf40c650> <<< 7554 1726853148.07676: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf40d760> <<< 7554 1726853148.07710: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf586db0> <<< 7554 1726853148.07731: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf587980> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf5869c0> # zipimport: zlib available <<< 7554 1726853148.07768: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 7554 1726853148.07784: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.07858: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.07978: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.07994: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 7554 1726853148.08025: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 7554 1726853148.08141: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.08567: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.09251: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.09255: stdout chunk (state=3): >>> <<< 7554 1726853148.10218: stdout chunk (state=3): 
>>>import 'ansible.module_utils.six' # <<< 7554 1726853148.10225: stdout chunk (state=3): >>> <<< 7554 1726853148.10256: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 7554 1726853148.10258: stdout chunk (state=3): >>> <<< 7554 1726853148.10286: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 7554 1726853148.10325: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 7554 1726853148.10330: stdout chunk (state=3): >>> <<< 7554 1726853148.10370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 7554 1726853148.10375: stdout chunk (state=3): >>> <<< 7554 1726853148.10459: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf415a30><<< 7554 1726853148.10587: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 7554 1726853148.10596: stdout chunk (state=3): >>> <<< 7554 1726853148.10611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc'<<< 7554 1726853148.10624: stdout chunk (state=3): >>> <<< 7554 1726853148.10650: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf416720><<< 7554 1726853148.10656: stdout chunk (state=3): >>> <<< 7554 1726853148.10681: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf40db80><<< 7554 
1726853148.10686: stdout chunk (state=3): >>> <<< 7554 1726853148.10757: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 7554 1726853148.10763: stdout chunk (state=3): >>> <<< 7554 1726853148.10789: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.10797: stdout chunk (state=3): >>> <<< 7554 1726853148.10847: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.11076: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 7554 1726853148.11363: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.11414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 7554 1726853148.11452: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc'<<< 7554 1726853148.11464: stdout chunk (state=3): >>> import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf416570> <<< 7554 1726853148.11501: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.12303: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.13137: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.13178: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.13301: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 7554 1726853148.13344: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.13401: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.13497: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 7554 1726853148.13501: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.13610: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.13744: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # 
<<< 7554 1726853148.13806: stdout chunk (state=3): >>> # zipimport: zlib available <<< 7554 1726853148.13810: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.13861: stdout chunk (state=3): >>> import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 7554 1726853148.13912: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.13964: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 7554 1726853148.14001: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.14160: stdout chunk (state=3): >>> <<< 7554 1726853148.14395: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.14401: stdout chunk (state=3): >>> <<< 7554 1726853148.14784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 7554 1726853148.14894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 7554 1726853148.14935: stdout chunk (state=3): >>>import '_ast' # <<< 7554 1726853148.15048: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4179e0><<< 7554 1726853148.15051: stdout chunk (state=3): >>> <<< 7554 1726853148.15081: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.15201: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.15313: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 7554 1726853148.15323: stdout chunk (state=3): >>> <<< 7554 1726853148.15339: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 7554 1726853148.15364: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 7554 1726853148.15390: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 7554 1726853148.15430: stdout chunk (state=3): >>> # zipimport: zlib 
available<<< 7554 1726853148.15435: stdout chunk (state=3): >>> <<< 7554 1726853148.15494: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.15561: stdout chunk (state=3): >>> import 'ansible.module_utils.common.locale' # <<< 7554 1726853148.15565: stdout chunk (state=3): >>> <<< 7554 1726853148.15589: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.15594: stdout chunk (state=3): >>> <<< 7554 1726853148.15670: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.15745: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.15841: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7554 1726853148.15844: stdout chunk (state=3): >>> <<< 7554 1726853148.15956: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 7554 1726853148.15965: stdout chunk (state=3): >>> <<< 7554 1726853148.16031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 7554 1726853148.16037: stdout chunk (state=3): >>> <<< 7554 1726853148.16170: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.16177: stdout chunk (state=3): >>> <<< 7554 1726853148.16200: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf422330><<< 7554 1726853148.16203: stdout chunk (state=3): >>> <<< 7554 1726853148.16257: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf41dc10> <<< 7554 1726853148.16288: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.file' # <<< 7554 1726853148.16394: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 7554 1726853148.16399: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.16488: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.16527: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.16583: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 7554 1726853148.16590: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.16661: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 7554 1726853148.16666: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 7554 1726853148.16751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 7554 1726853148.16777: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 7554 1726853148.16795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 7554 1726853148.16887: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf50ac30> <<< 7554 1726853148.16948: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafbc2900> <<< 7554 1726853148.17061: stdout chunk (state=3): >>>import 
'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf422420> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf414410> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 7554 1726853148.17112: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.17150: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 7554 1726853148.17153: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 7554 1726853148.17222: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 7554 1726853148.17255: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.17260: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 7554 1726853148.17357: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.17457: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.17476: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.17505: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.17560: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.17618: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.17665: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.17717: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 7554 1726853148.17725: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.17846: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.17956: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.17992: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.18038: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib 
available <<< 7554 1726853148.18561: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.18600: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.18657: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.18723: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 7554 1726853148.18729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.18756: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 7554 1726853148.18787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 7554 1726853148.18799: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 7554 1726853148.18834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 7554 1726853148.18861: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4b28d0> <<< 7554 1726853148.18894: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 7554 1726853148.18903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 7554 1726853148.18934: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 7554 1726853148.18996: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 7554 1726853148.19020: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 7554 1726853148.19035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 7554 1726853148.19058: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf050470> <<< 7554 1726853148.19085: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.19109: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf0506e0> <<< 7554 1726853148.19172: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf49c4a0> <<< 7554 1726853148.19364: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4b3440> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4b0fe0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4b0b60> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 7554 1726853148.19387: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 7554 1726853148.19423: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.19427: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf053740> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf052ff0> <<< 7554 1726853148.19460: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf0531d0> <<< 7554 1726853148.19487: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf052420> <<< 7554 1726853148.19506: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 7554 1726853148.19695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 7554 1726853148.19707: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf053890> <<< 7554 1726853148.19735: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 7554 1726853148.19782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 7554 1726853148.19852: stdout chunk 
(state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf0ae3c0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf0ac3e0> <<< 7554 1726853148.19884: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4b0c80> <<< 7554 1726853148.19893: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 7554 1726853148.19901: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 7554 1726853148.19934: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.19942: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 7554 1726853148.20044: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.20115: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 7554 1726853148.20147: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20208: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20281: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 7554 1726853148.20284: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20308: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 7554 1726853148.20329: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20369: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20406: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.apparmor' # <<< 7554 1726853148.20412: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20542: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # <<< 7554 1726853148.20547: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20606: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 7554 1726853148.20675: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20750: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20840: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20910: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.20993: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 7554 1726853148.21154: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.21802: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.22515: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 7554 1726853148.22537: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.22610: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.22690: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.22731: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.22789: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 7554 1726853148.22792: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 7554 1726853148.22838: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.22881: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.env' # <<< 7554 1726853148.22884: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.22961: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.23036: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 7554 1726853148.23059: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.23100: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.23138: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 7554 1726853148.23143: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.23183: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.23225: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 7554 1726853148.23232: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.23463: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 7554 1726853148.23479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 7554 1726853148.23504: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf0afd70> <<< 7554 1726853148.23538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 7554 1726853148.23577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 7554 1726853148.23759: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf0af170> import 'ansible.module_utils.facts.system.local' # <<< 7554 1726853148.23777: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.23873: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.23973: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 7554 1726853148.23976: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.24110: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.24235: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 7554 1726853148.24248: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.24338: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.24439: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 7554 1726853148.24503: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.24574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 7554 1726853148.24639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 7554 1726853148.24737: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.24829: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf0ee5a0> <<< 7554 1726853148.25129: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf0dc8c0> import 'ansible.module_utils.facts.system.python' # <<< 7554 1726853148.25231: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.25298: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 7554 1726853148.25316: stdout chunk (state=3): >>># zipimport: zlib available <<< 
7554 1726853148.25438: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.25564: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.25740: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.25958: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 7554 1726853148.25981: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.26029: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.26087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 7554 1726853148.26093: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.26208: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 7554 1726853148.26257: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.26285: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf102150> <<< 7554 1726853148.26288: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf0ee3f0> import 'ansible.module_utils.facts.system.user' # <<< 7554 1726853148.26307: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.26339: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 7554 1726853148.26356: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.26454: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # <<< 7554 1726853148.26468: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.26706: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.26939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 7554 1726853148.26945: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.27101: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.27294: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.27353: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 7554 1726853148.27366: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 7554 1726853148.27376: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.27397: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.27430: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.27644: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.27860: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 7554 1726853148.28068: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.28073: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.28089: stdout chunk (state=3): >>> <<< 7554 1726853148.28269: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 7554 1726853148.28280: stdout chunk (state=3): >>> <<< 7554 1726853148.28300: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.28305: stdout chunk (state=3): >>> <<< 7554 1726853148.28357: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.28363: stdout chunk (state=3): >>> <<< 7554 1726853148.28422: 
stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.28425: stdout chunk (state=3): >>> <<< 7554 1726853148.29329: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.29335: stdout chunk (state=3): >>> <<< 7554 1726853148.30178: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 7554 1726853148.30189: stdout chunk (state=3): >>> <<< 7554 1726853148.30208: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 7554 1726853148.30216: stdout chunk (state=3): >>> <<< 7554 1726853148.30243: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.30251: stdout chunk (state=3): >>> <<< 7554 1726853148.30415: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.30422: stdout chunk (state=3): >>> <<< 7554 1726853148.30581: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 7554 1726853148.30611: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7554 1726853148.30616: stdout chunk (state=3): >>> <<< 7554 1726853148.30916: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # <<< 7554 1726853148.30921: stdout chunk (state=3): >>> <<< 7554 1726853148.30947: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.31202: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.31452: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 7554 1726853148.31457: stdout chunk (state=3): >>> <<< 7554 1726853148.31482: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.31487: stdout chunk (state=3): >>> <<< 7554 1726853148.31517: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.31534: stdout chunk (state=3): >>> <<< 7554 1726853148.31537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 7554 
1726853148.31573: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.31641: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.31707: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 7554 1726853148.31857: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.31894: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.31900: stdout chunk (state=3): >>> <<< 7554 1726853148.32055: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.32066: stdout chunk (state=3): >>> <<< 7554 1726853148.32397: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.32403: stdout chunk (state=3): >>> <<< 7554 1726853148.32727: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 7554 1726853148.32738: stdout chunk (state=3): >>> <<< 7554 1726853148.32751: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 7554 1726853148.32757: stdout chunk (state=3): >>> <<< 7554 1726853148.32782: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.32787: stdout chunk (state=3): >>> <<< 7554 1726853148.32841: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.32848: stdout chunk (state=3): >>> <<< 7554 1726853148.32896: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 7554 1726853148.32929: stdout chunk (state=3): >>> # zipimport: zlib available <<< 7554 1726853148.32967: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.33002: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.dragonfly' # <<< 7554 1726853148.33011: stdout chunk (state=3): >>> <<< 7554 1726853148.33031: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.33145: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7554 1726853148.33153: stdout chunk (state=3): 
>>> <<< 7554 1726853148.33258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 7554 1726853148.33261: stdout chunk (state=3): >>> <<< 7554 1726853148.33461: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.33508: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 7554 1726853148.33513: stdout chunk (state=3): >>> <<< 7554 1726853148.33537: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.33541: stdout chunk (state=3): >>> <<< 7554 1726853148.33631: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.33637: stdout chunk (state=3): >>> <<< 7554 1726853148.33724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 7554 1726853148.33729: stdout chunk (state=3): >>> <<< 7554 1726853148.33750: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.33957: stdout chunk (state=3): >>> <<< 7554 1726853148.34208: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.34211: stdout chunk (state=3): >>> <<< 7554 1726853148.34670: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 7554 1726853148.34677: stdout chunk (state=3): >>> <<< 7554 1726853148.34699: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.34704: stdout chunk (state=3): >>> <<< 7554 1726853148.34799: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.34802: stdout chunk (state=3): >>> <<< 7554 1726853148.34895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 7554 1726853148.34899: stdout chunk (state=3): >>> <<< 7554 1726853148.34938: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.34991: stdout chunk (state=3): >>># zipimport: zlib 
available<<< 7554 1726853148.34997: stdout chunk (state=3): >>> <<< 7554 1726853148.35052: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 7554 1726853148.35056: stdout chunk (state=3): >>> <<< 7554 1726853148.35086: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.35094: stdout chunk (state=3): >>> <<< 7554 1726853148.35139: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.35144: stdout chunk (state=3): >>> <<< 7554 1726853148.35194: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 7554 1726853148.35200: stdout chunk (state=3): >>> <<< 7554 1726853148.35224: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.35229: stdout chunk (state=3): >>> <<< 7554 1726853148.35283: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.35289: stdout chunk (state=3): >>> <<< 7554 1726853148.35356: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available<<< 7554 1726853148.35361: stdout chunk (state=3): >>> <<< 7554 1726853148.35600: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # <<< 7554 1726853148.35603: stdout chunk (state=3): >>> <<< 7554 1726853148.35633: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.35639: stdout chunk (state=3): >>> <<< 7554 1726853148.35665: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.35684: stdout chunk (state=3): >>> <<< 7554 1726853148.35687: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 7554 1726853148.35697: stdout chunk (state=3): >>> <<< 7554 1726853148.35707: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.35755: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.35794: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.base' # <<< 7554 1726853148.35809: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.35835: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.35851: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.35896: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.35951: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.36047: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.36224: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.36301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 7554 1726853148.36305: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.36566: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.36818: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 7554 1726853148.36854: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.36917: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 7554 1726853148.36924: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.37009: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.37043: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 7554 1726853148.37057: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.37138: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.37239: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # 
<<< 7554 1726853148.37250: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.37342: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.37515: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 7554 1726853148.37553: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.37735: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 7554 1726853148.37740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 7554 1726853148.37759: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 7554 1726853148.37779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 7554 1726853148.37810: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaef03980> <<< 7554 1726853148.37833: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaef00410> <<< 7554 1726853148.37889: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaef00650> <<< 7554 1726853148.39063: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", 
"MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_d<<< 7554 1726853148.39082: stdout chunk (state=3): >>>ir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, 
"ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "25", "second": "48", "epoch": "1726853148", "epoch_int": "1726853148", "date": "2024-09-20", "time": "13:25:48", "iso8601_micro": "2024-09-20T17:25:48.381383Z", "iso8601": "2024-09-20T17:25:48Z", "iso8601_basic": "20240920T132548381383", "iso8601_basic_short": "20240920T132548", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7554 1726853148.39898: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 7554 1726853148.39936: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout<<< 7554 1726853148.39972: stdout chunk (state=3): >>> # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # 
cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 <<< 7554 1726853148.39997: stdout chunk (state=3): >>># cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types<<< 7554 1726853148.40025: stdout chunk (state=3): >>> # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections<<< 7554 1726853148.40043: stdout chunk (state=3): >>> # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64<<< 7554 1726853148.40069: stdout chunk (state=3): >>> # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno<<< 7554 
1726853148.40091: stdout chunk (state=3): >>> # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib<<< 7554 1726853148.40122: stdout chunk (state=3): >>> # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib<<< 7554 1726853148.40144: stdout chunk (state=3): >>> # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil<<< 7554 1726853148.40283: stdout chunk (state=3): >>> # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] 
removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy 
ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext <<< 7554 1726853148.40302: stdout chunk (state=3): >>> # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing 
ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # 
cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux <<< 7554 1726853148.40321: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat.version <<< 7554 1726853148.40338: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr<<< 7554 1726853148.40360: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys <<< 7554 1726853148.40363: stdout chunk (state=3): >>># cleanup[2] removing termios<<< 7554 1726853148.40383: stdout chunk (state=3): >>> # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base<<< 7554 1726853148.40399: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos<<< 7554 1726853148.40420: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing 
ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux <<< 7554 1726853148.40443: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux<<< 7554 1726853148.40474: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace<<< 7554 1726853148.40488: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time<<< 7554 1726853148.40510: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr<<< 7554 1726853148.40527: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux<<< 7554 1726853148.40556: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy 
ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin<<< 7554 1726853148.40580: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base<<< 7554 1726853148.40604: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna<<< 7554 1726853148.40652: stdout chunk (state=3): >>> <<< 7554 1726853148.41268: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 7554 1726853148.41275: stdout chunk (state=3): >>> <<< 7554 1726853148.41303: stdout chunk (state=3): >>># destroy importlib.machinery 
<<< 7554 1726853148.41360: stdout chunk (state=3): >>># destroy importlib._abc # destroy importlib.util # destroy _bz2<<< 7554 1726853148.41366: stdout chunk (state=3): >>> <<< 7554 1726853148.41377: stdout chunk (state=3): >>># destroy _compression<<< 7554 1726853148.41383: stdout chunk (state=3): >>> # destroy _lzma<<< 7554 1726853148.41410: stdout chunk (state=3): >>> # destroy _blake2<<< 7554 1726853148.41413: stdout chunk (state=3): >>> # destroy binascii<<< 7554 1726853148.41441: stdout chunk (state=3): >>> # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path<<< 7554 1726853148.41473: stdout chunk (state=3): >>> # destroy zipfile<<< 7554 1726853148.41481: stdout chunk (state=3): >>> <<< 7554 1726853148.41484: stdout chunk (state=3): >>># destroy pathlib <<< 7554 1726853148.41551: stdout chunk (state=3): >>># destroy zipfile._path.glob # destroy ipaddress # destroy ntpath<<< 7554 1726853148.41554: stdout chunk (state=3): >>> <<< 7554 1726853148.41585: stdout chunk (state=3): >>># destroy importlib <<< 7554 1726853148.41596: stdout chunk (state=3): >>># destroy zipimport <<< 7554 1726853148.41613: stdout chunk (state=3): >>># destroy __main__<<< 7554 1726853148.41630: stdout chunk (state=3): >>> # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder<<< 7554 1726853148.41648: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner<<< 7554 1726853148.41656: stdout chunk (state=3): >>> # destroy _json<<< 7554 1726853148.41678: stdout chunk (state=3): >>> # destroy grp<<< 7554 1726853148.41691: stdout chunk (state=3): >>> # destroy encodings # destroy _locale<<< 7554 1726853148.41702: stdout chunk (state=3): >>> <<< 7554 1726853148.41722: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal<<< 7554 1726853148.41736: stdout chunk (state=3): >>> # destroy _posixsubprocess<<< 7554 1726853148.41752: stdout chunk (state=3): >>> <<< 7554 1726853148.41814: stdout chunk 
(state=3): >>># destroy syslog # destroy uuid # destroy selinux<<< 7554 1726853148.41832: stdout chunk (state=3): >>> <<< 7554 1726853148.41834: stdout chunk (state=3): >>># destroy shutil <<< 7554 1726853148.41872: stdout chunk (state=3): >>># destroy distro <<< 7554 1726853148.41880: stdout chunk (state=3): >>># destroy distro.distro<<< 7554 1726853148.41936: stdout chunk (state=3): >>> # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors <<< 7554 1726853148.41959: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector <<< 7554 1726853148.41985: stdout chunk (state=3): >>># destroy multiprocessing <<< 7554 1726853148.41994: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy multiprocessing.pool<<< 7554 1726853148.42015: stdout chunk (state=3): >>> # destroy signal<<< 7554 1726853148.42021: stdout chunk (state=3): >>> # destroy pickle<<< 7554 1726853148.42061: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle <<< 7554 1726853148.42091: stdout chunk (state=3): >>># destroy queue # destroy _heapq<<< 7554 1726853148.42112: stdout chunk (state=3): >>> # destroy _queue # destroy multiprocessing.process<<< 7554 1726853148.42121: stdout chunk (state=3): >>> # destroy unicodedata<<< 7554 1726853148.42139: stdout chunk (state=3): >>> # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors<<< 7554 1726853148.42188: stdout chunk (state=3): >>> # destroy _multiprocessing # destroy shlex <<< 7554 1726853148.42199: stdout chunk (state=3): >>># destroy fcntl<<< 7554 1726853148.42222: stdout chunk (state=3): >>> # destroy datetime<<< 7554 1726853148.42233: stdout chunk (state=3): >>> <<< 7554 1726853148.42243: stdout chunk (state=3): >>># destroy subprocess # destroy base64<<< 7554 1726853148.42282: stdout chunk (state=3): >>> # destroy _ssl<<< 7554 1726853148.42287: 
stdout chunk (state=3): >>> <<< 7554 1726853148.42327: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 7554 1726853148.42329: stdout chunk (state=3): >>> # destroy getpass<<< 7554 1726853148.42362: stdout chunk (state=3): >>> # destroy pwd # destroy termios # destroy errno <<< 7554 1726853148.42414: stdout chunk (state=3): >>># destroy json # destroy socket <<< 7554 1726853148.42424: stdout chunk (state=3): >>># destroy struct <<< 7554 1726853148.42450: stdout chunk (state=3): >>># destroy glob<<< 7554 1726853148.42459: stdout chunk (state=3): >>> # destroy fnmatch<<< 7554 1726853148.42540: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna <<< 7554 1726853148.42561: stdout chunk (state=3): >>># destroy stringprep <<< 7554 1726853148.42577: stdout chunk (state=3): >>># cleanup[3] wiping configparser <<< 7554 1726853148.42589: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 7554 1726853148.42607: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian<<< 7554 1726853148.42613: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves<<< 7554 1726853148.42638: stdout chunk (state=3): >>> # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 7554 1726853148.42652: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime<<< 7554 1726853148.42682: stdout chunk (state=3): >>> # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize<<< 7554 1726853148.42689: stdout chunk (state=3): >>> # cleanup[3] wiping _tokenize<<< 7554 
1726853148.42709: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 7554 1726853148.42714: stdout chunk (state=3): >>> # cleanup[3] wiping _typing<<< 7554 1726853148.42740: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 7554 1726853148.42750: stdout chunk (state=3): >>> # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math<<< 7554 1726853148.42781: stdout chunk (state=3): >>> # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap<<< 7554 1726853148.42787: stdout chunk (state=3): >>> # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler<<< 7554 1726853148.42813: stdout chunk (state=3): >>> # destroy enum # cleanup[3] wiping copyreg<<< 7554 1726853148.42816: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser<<< 7554 1726853148.42841: stdout chunk (state=3): >>> # cleanup[3] wiping _sre<<< 7554 1726853148.42857: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 7554 1726853148.42876: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc<<< 7554 1726853148.42960: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping 
_codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 7554 1726853148.43220: stdout chunk (state=3): >>># destroy sys.monitoring<<< 7554 1726853148.43229: stdout chunk (state=3): >>> <<< 7554 1726853148.43247: stdout chunk (state=3): >>># destroy _socket <<< 7554 1726853148.43284: stdout chunk (state=3): >>># destroy _collections <<< 7554 1726853148.43323: stdout chunk (state=3): >>># destroy platform<<< 7554 1726853148.43328: stdout chunk (state=3): >>> <<< 7554 1726853148.43348: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 7554 1726853148.43395: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib<<< 7554 1726853148.43410: stdout chunk (state=3): >>> <<< 7554 1726853148.43412: stdout chunk (state=3): >>># destroy copyreg<<< 7554 1726853148.43456: stdout chunk (state=3): >>> # destroy contextlib # destroy _typing <<< 7554 1726853148.43491: stdout chunk (state=3): >>># destroy _tokenize <<< 7554 1726853148.43494: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse<<< 7554 1726853148.43497: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request <<< 7554 1726853148.43518: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 7554 
1726853148.43583: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path<<< 7554 1726853148.43596: stdout chunk (state=3): >>> # clear sys.modules<<< 7554 1726853148.43605: stdout chunk (state=3): >>> <<< 7554 1726853148.43730: stdout chunk (state=3): >>># destroy _frozen_importlib # destroy codecs<<< 7554 1726853148.43747: stdout chunk (state=3): >>> # destroy encodings.aliases<<< 7554 1726853148.43775: stdout chunk (state=3): >>> # destroy encodings.utf_8<<< 7554 1726853148.43778: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig <<< 7554 1726853148.43797: stdout chunk (state=3): >>># destroy encodings.cp437<<< 7554 1726853148.43810: stdout chunk (state=3): >>> # destroy encodings.idna <<< 7554 1726853148.43825: stdout chunk (state=3): >>># destroy _codecs <<< 7554 1726853148.43849: stdout chunk (state=3): >>># destroy io <<< 7554 1726853148.43875: stdout chunk (state=3): >>># destroy traceback <<< 7554 1726853148.43879: stdout chunk (state=3): >>># destroy warnings # destroy weakref <<< 7554 1726853148.43905: stdout chunk (state=3): >>># destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 7554 1726853148.43926: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time<<< 7554 1726853148.43965: stdout chunk (state=3): >>> # destroy _random <<< 7554 1726853148.43985: stdout chunk (state=3): >>># destroy _weakref <<< 7554 1726853148.44011: stdout chunk (state=3): >>># destroy _hashlib<<< 7554 1726853148.44040: stdout chunk (state=3): >>> # destroy _operator<<< 7554 1726853148.44050: stdout chunk (state=3): >>> # destroy _sre # destroy _string<<< 7554 1726853148.44082: stdout chunk (state=3): >>> # destroy re # destroy itertools <<< 7554 1726853148.44117: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools<<< 7554 1726853148.44125: stdout chunk (state=3): >>> # destroy builtins # destroy _thread<<< 7554 
1726853148.44156: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 7554 1726853148.44355: stdout chunk (state=3): >>> <<< 7554 1726853148.44681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853148.44714: stderr chunk (state=3): >>><<< 7554 1726853148.44717: stdout chunk (state=3): >>><<< 7554 1726853148.44826: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadb00184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaffe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadb001aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: 
'/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe6bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe6bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafea3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fadafea3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe69070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafec37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafec23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafec0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafef8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafef8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafef8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafef8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafe66e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafef9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafef9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafefa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaff10740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaff11e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaff12cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaff132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaff12210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaff13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaff134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafefa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafc23c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafc4c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc4c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafc4c740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafc4d070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafc4da60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc4c920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc21df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc4ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc4db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafefac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fadafc73170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc9b500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafcfc260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafcfe9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafcfc380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafcc1250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fadafafd340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc9a330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafc4fd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fadafafd5e0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_c17x78ax/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb670b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb45fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb45130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb64f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafb96a50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb967e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb960f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb96540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb67d40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafb97800> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fadafb97a10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafb97f50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf529d90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf52b9b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf52c380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf52d520> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf52ff80> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf5342f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf52e240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf537fb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf536a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf5367e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf536d50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf52e750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf57c1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf57c3b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf57ddc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf57db80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf580350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf57e4b0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf583b30> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf580500> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf584e30> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf584ce0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf584f80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf57c470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf40c650> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf40d760> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf586db0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf587980> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf5869c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf415a30> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf416720> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf40db80> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf416570> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4179e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf422330> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf41dc10> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf50ac30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadafbc2900> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf422420> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf414410> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4b28d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf050470> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf0506e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf49c4a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fadaf4b3440> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4b0fe0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4b0b60> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf053740> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf052ff0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf0531d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf052420> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf053890> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf0ae3c0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf0ac3e0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf4b0c80> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf0afd70> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf0af170> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf0ee5a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf0dc8c0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaf102150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaf0ee3f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fadaef03980> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaef00410> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fadaef00650> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", 
"ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "25", "second": "48", "epoch": "1726853148", "epoch_int": "1726853148", "date": "2024-09-20", "time": "13:25:48", 
"iso8601_micro": "2024-09-20T17:25:48.381383Z", "iso8601": "2024-09-20T17:25:48Z", "iso8601_basic": "20240920T132548381383", "iso8601_basic_short": "20240920T132548", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], 
"gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # 
cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian 
# cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # 
cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # 
cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools 
# cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing 
collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] 
removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] 
removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing 
ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy 
ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy 
importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 7554 1726853148.45664: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': 
'/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853148.45667: _low_level_execute_command(): starting 7554 1726853148.45672: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853147.7901936-7664-1206347560579/ > /dev/null 2>&1 && sleep 0' 7554 1726853148.45676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853148.45678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853148.45681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853148.45683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853148.45685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853148.45688: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853148.45690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.45692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853148.45697: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853148.45702: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7554 1726853148.45704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853148.45707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853148.45722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853148.45725: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853148.45727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.45788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853148.45792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853148.45796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853148.45862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853148.48449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853148.48475: stderr chunk (state=3): >>><<< 7554 1726853148.48478: stdout chunk (state=3): >>><<< 7554 1726853148.48492: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853148.48498: handler run complete 7554 1726853148.48530: variable 'ansible_facts' from source: unknown 7554 1726853148.48568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853148.48638: variable 'ansible_facts' from source: unknown 7554 1726853148.48670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853148.48703: attempt loop complete, returning result 7554 1726853148.48706: _execute() done 7554 1726853148.48709: dumping result to json 7554 1726853148.48717: done dumping result, returning 7554 1726853148.48725: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-bdc3-98b6-000000000166] 7554 1726853148.48730: sending task result for task 02083763-bbaf-bdc3-98b6-000000000166 7554 1726853148.48859: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000166 7554 1726853148.48862: WORKER PROCESS EXITING ok: [managed_node3] 7554 1726853148.48952: no more pending results, returning what we have 7554 1726853148.48955: results queue empty 7554 1726853148.48956: checking for any_errors_fatal 7554 1726853148.48957: done checking for any_errors_fatal 7554 1726853148.48958: checking for max_fail_percentage 7554 1726853148.48959: done checking for max_fail_percentage 7554 1726853148.48960: checking to see if all hosts have failed and the running result is not ok 7554 1726853148.48961: done checking to see if all hosts have failed 7554 1726853148.48962: getting the remaining hosts for this loop 7554 1726853148.48963: done getting the remaining hosts for this loop 7554 1726853148.48966: getting the next task for host managed_node3 7554 1726853148.48975: done getting next task for host managed_node3 7554 
1726853148.48977: ^ task is: TASK: Check if system is ostree 7554 1726853148.48979: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853148.48982: getting variables 7554 1726853148.48983: in VariableManager get_vars() 7554 1726853148.49008: Calling all_inventory to load vars for managed_node3 7554 1726853148.49010: Calling groups_inventory to load vars for managed_node3 7554 1726853148.49012: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853148.49021: Calling all_plugins_play to load vars for managed_node3 7554 1726853148.49025: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853148.49027: Calling groups_plugins_play to load vars for managed_node3 7554 1726853148.49165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853148.49284: done with get_vars() 7554 1726853148.49294: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:25:48 -0400 (0:00:00.772) 0:00:02.461 ****** 7554 1726853148.49361: entering _queue_task() for managed_node3/stat 7554 1726853148.49552: worker is 1 (out of 1 available) 7554 1726853148.49566: exiting _queue_task() for managed_node3/stat 7554 1726853148.49578: done queuing things up, now waiting for results 
queue to drain 7554 1726853148.49580: waiting for pending results... 7554 1726853148.49722: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 7554 1726853148.49787: in run() - task 02083763-bbaf-bdc3-98b6-000000000168 7554 1726853148.49797: variable 'ansible_search_path' from source: unknown 7554 1726853148.49801: variable 'ansible_search_path' from source: unknown 7554 1726853148.49830: calling self._execute() 7554 1726853148.49884: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853148.49888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853148.49897: variable 'omit' from source: magic vars 7554 1726853148.50221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853148.50397: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853148.50427: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853148.50451: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853148.50481: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853148.50555: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853148.50575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853148.50596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853148.50613: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853148.50704: Evaluated conditional (not __network_is_ostree is defined): True 7554 1726853148.50707: variable 'omit' from source: magic vars 7554 1726853148.50731: variable 'omit' from source: magic vars 7554 1726853148.50756: variable 'omit' from source: magic vars 7554 1726853148.50777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853148.50801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853148.50815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853148.50828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853148.50836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853148.50859: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853148.50862: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853148.50865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853148.50935: Set connection var ansible_shell_executable to /bin/sh 7554 1726853148.50941: Set connection var ansible_pipelining to False 7554 1726853148.50944: Set connection var ansible_shell_type to sh 7554 1726853148.50949: Set connection var ansible_connection to ssh 7554 1726853148.50955: Set connection var ansible_timeout to 10 7554 1726853148.50959: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853148.50977: variable 'ansible_shell_executable' from source: unknown 7554 1726853148.50980: variable 'ansible_connection' from source: 
unknown 7554 1726853148.50982: variable 'ansible_module_compression' from source: unknown 7554 1726853148.50985: variable 'ansible_shell_type' from source: unknown 7554 1726853148.50987: variable 'ansible_shell_executable' from source: unknown 7554 1726853148.50989: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853148.50993: variable 'ansible_pipelining' from source: unknown 7554 1726853148.50995: variable 'ansible_timeout' from source: unknown 7554 1726853148.51004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853148.51099: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853148.51108: variable 'omit' from source: magic vars 7554 1726853148.51111: starting attempt loop 7554 1726853148.51114: running the handler 7554 1726853148.51127: _low_level_execute_command(): starting 7554 1726853148.51134: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853148.51636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853148.51639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.51642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853148.51644: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853148.51649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.51688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853148.51700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853148.51776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853148.54091: stdout chunk (state=3): >>>/root <<< 7554 1726853148.54230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853148.54262: stderr chunk (state=3): >>><<< 7554 1726853148.54266: stdout chunk (state=3): >>><<< 7554 1726853148.54284: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853148.54294: _low_level_execute_command(): starting 7554 1726853148.54300: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008 `" && echo ansible-tmp-1726853148.5428276-7688-225039034052008="` echo /root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008 `" ) && sleep 0' 7554 1726853148.54736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853148.54739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.54741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853148.54743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853148.54745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.54796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' <<< 7554 1726853148.54807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853148.54810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853148.54875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853148.57630: stdout chunk (state=3): >>>ansible-tmp-1726853148.5428276-7688-225039034052008=/root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008 <<< 7554 1726853148.57789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853148.57812: stderr chunk (state=3): >>><<< 7554 1726853148.57817: stdout chunk (state=3): >>><<< 7554 1726853148.57837: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853148.5428276-7688-225039034052008=/root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853148.57876: variable 'ansible_module_compression' from source: unknown 7554 1726853148.57917: ANSIBALLZ: Using lock for stat 7554 1726853148.57920: ANSIBALLZ: Acquiring lock 7554 1726853148.57923: ANSIBALLZ: Lock acquired: 140257826528224 7554 1726853148.57927: ANSIBALLZ: Creating module 7554 1726853148.65274: ANSIBALLZ: Writing module into payload 7554 1726853148.65337: ANSIBALLZ: Writing module 7554 1726853148.65354: ANSIBALLZ: Renaming module 7554 1726853148.65359: ANSIBALLZ: Done creating module 7554 1726853148.65375: variable 'ansible_facts' from source: unknown 7554 1726853148.65419: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008/AnsiballZ_stat.py 7554 1726853148.65522: Sending initial data 7554 1726853148.65526: Sent initial data (151 bytes) 7554 1726853148.65966: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853148.65975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853148.66004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.66007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853148.66009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853148.66012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.66062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853148.66065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853148.66086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853148.66154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853148.68447: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853148.68451: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853148.68507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853148.68569: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpl_if0x5b /root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008/AnsiballZ_stat.py <<< 7554 1726853148.68576: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008/AnsiballZ_stat.py" <<< 7554 1726853148.68634: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpl_if0x5b" to remote "/root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008/AnsiballZ_stat.py" <<< 7554 1726853148.68636: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008/AnsiballZ_stat.py" <<< 7554 1726853148.69260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853148.69299: stderr chunk (state=3): >>><<< 7554 1726853148.69302: stdout chunk (state=3): >>><<< 7554 1726853148.69337: done transferring module to remote 7554 1726853148.69355: _low_level_execute_command(): starting 7554 1726853148.69358: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008/ /root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008/AnsiballZ_stat.py && sleep 0' 7554 1726853148.69808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853148.69811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853148.69813: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.69816: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853148.69819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.69860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853148.69863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853148.69934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853148.72560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853148.72586: stderr chunk (state=3): >>><<< 7554 1726853148.72591: stdout chunk (state=3): >>><<< 7554 1726853148.72609: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853148.72612: _low_level_execute_command(): starting 7554 1726853148.72616: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008/AnsiballZ_stat.py && sleep 0' 7554 1726853148.73037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853148.73075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853148.73078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.73081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853148.73083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853148.73124: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853148.73127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853148.73203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853148.76312: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin<<< 7554 1726853148.76315: stdout chunk (state=3): >>> <<< 7554 1726853148.76343: stdout chunk (state=3): >>>import '_thread' # <<< 7554 1726853148.76370: stdout chunk (state=3): >>> <<< 7554 1726853148.76387: stdout chunk (state=3): >>>import '_warnings' # <<< 7554 1726853148.76394: stdout chunk (state=3): >>>import '_weakref' # <<< 7554 1726853148.76499: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 7554 1726853148.76573: stdout chunk (state=3): >>> import 'posix' # <<< 7554 1726853148.76578: stdout chunk (state=3): >>> <<< 7554 1726853148.76638: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 7554 1726853148.76641: stdout chunk (state=3): >>> # installing zipimport hook<<< 7554 1726853148.76643: stdout chunk (state=3): >>> <<< 7554 1726853148.76667: stdout chunk (state=3): >>>import 'time' # <<< 7554 1726853148.76693: stdout chunk (state=3): >>> import 'zipimport' # # installed zipimport hook<<< 7554 1726853148.76779: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 7554 1726853148.76788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.76830: stdout chunk (state=3): >>>import '_codecs' # <<< 7554 1726853148.76873: stdout chunk (state=3): >>>import 'codecs' # <<< 7554 1726853148.76931: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/aliases.py<<< 7554 1726853148.76963: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 7554 1726853148.76987: stdout chunk (state=3): >>> <<< 7554 1726853148.76990: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b10dc4d0> <<< 7554 1726853148.77062: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b10abb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py<<< 7554 1726853148.77069: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 7554 1726853148.77072: stdout chunk (state=3): >>> <<< 7554 1726853148.77165: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b10dea50> import '_signal' # import '_abc' # import 'abc' # <<< 7554 1726853148.77177: stdout chunk (state=3): >>>import 'io' # <<< 7554 1726853148.77230: stdout chunk (state=3): >>>import '_stat' # <<< 7554 1726853148.77238: stdout chunk (state=3): >>>import 'stat' # <<< 7554 1726853148.77392: stdout chunk (state=3): >>>import '_collections_abc' # <<< 7554 1726853148.77432: stdout chunk (state=3): >>>import 'genericpath' # <<< 7554 1726853148.77451: stdout chunk (state=3): >>>import 'posixpath' # <<< 7554 1726853148.77503: stdout chunk (state=3): >>> import 'os' # <<< 7554 1726853148.77516: stdout chunk (state=3): >>> <<< 7554 1726853148.77530: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 7554 1726853148.77534: stdout chunk (state=3): >>> <<< 7554 1726853148.77558: stdout chunk (state=3): >>>Processing user site-packages<<< 7554 1726853148.77569: stdout chunk (state=3): >>> <<< 7554 1726853148.77583: stdout chunk (state=3): >>>Processing global site-packages <<< 7554 
1726853148.77621: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages'<<< 7554 1726853148.77625: stdout chunk (state=3): >>> <<< 7554 1726853148.77676: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py<<< 7554 1726853148.77696: stdout chunk (state=3): >>> <<< 7554 1726853148.77698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 7554 1726853148.77735: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0e91130><<< 7554 1726853148.77810: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 7554 1726853148.77835: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.77857: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0e91fa0> <<< 7554 1726853148.77904: stdout chunk (state=3): >>>import 'site' # <<< 7554 1726853148.77956: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux <<< 7554 1726853148.77961: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. 
<<< 7554 1726853148.78352: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 7554 1726853148.78389: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 7554 1726853148.78403: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 7554 1726853148.78406: stdout chunk (state=3): >>> <<< 7554 1726853148.78428: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 7554 1726853148.78455: stdout chunk (state=3): >>> <<< 7554 1726853148.78500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 7554 1726853148.78533: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 7554 1726853148.78583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 7554 1726853148.78602: stdout chunk (state=3): >>> import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ecfec0><<< 7554 1726853148.78607: stdout chunk (state=3): >>> <<< 7554 1726853148.78629: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 7554 1726853148.78663: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 7554 1726853148.78713: stdout chunk (state=3): >>>import '_operator' # <<< 7554 1726853148.78749: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ecff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches 
/usr/lib64/python3.12/functools.py<<< 7554 1726853148.78754: stdout chunk (state=3): >>> <<< 7554 1726853148.78793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc'<<< 7554 1726853148.78830: stdout chunk (state=3): >>> # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 7554 1726853148.78911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc'<<< 7554 1726853148.78940: stdout chunk (state=3): >>> import 'itertools' # <<< 7554 1726853148.78944: stdout chunk (state=3): >>> <<< 7554 1726853148.78978: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py<<< 7554 1726853148.78981: stdout chunk (state=3): >>> <<< 7554 1726853148.79027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f078c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py<<< 7554 1726853148.79030: stdout chunk (state=3): >>> <<< 7554 1726853148.79047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc'<<< 7554 1726853148.79054: stdout chunk (state=3): >>> import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f07f50><<< 7554 1726853148.79081: stdout chunk (state=3): >>> import '_collections' # <<< 7554 1726853148.79086: stdout chunk (state=3): >>> <<< 7554 1726853148.79152: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ee7b60><<< 7554 1726853148.79178: stdout chunk (state=3): >>> import '_functools' # <<< 7554 1726853148.79228: stdout chunk (state=3): >>>import 'functools' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ee52b0><<< 7554 1726853148.79353: stdout chunk (state=3): >>> <<< 7554 1726853148.79380: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ecd070> <<< 7554 1726853148.79424: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 7554 1726853148.79460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc'<<< 7554 1726853148.79463: stdout chunk (state=3): >>> <<< 7554 1726853148.79487: stdout chunk (state=3): >>>import '_sre' # <<< 7554 1726853148.79527: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 7554 1726853148.79570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 7554 1726853148.79598: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 7554 1726853148.79600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 7554 1726853148.79636: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f27860> <<< 7554 1726853148.79661: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f26480> <<< 7554 1726853148.79685: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 7554 1726853148.79884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ee6180> <<< 7554 
1726853148.79890: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f24c50> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f588f0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ecc2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0f58da0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f58c50> <<< 7554 1726853148.79893: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.79895: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0f58fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ecae10> <<< 7554 1726853148.79934: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.79963: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 7554 1726853148.79998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 7554 1726853148.80028: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f59640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f59340> <<< 7554 1726853148.80033: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 7554 1726853148.80078: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 7554 1726853148.80090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 7554 1726853148.80096: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f5a510> <<< 7554 1726853148.80121: stdout chunk (state=3): >>>import 'importlib.util' # <<< 7554 1726853148.80125: stdout chunk (state=3): >>>import 'runpy' # <<< 7554 1726853148.80162: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 7554 1726853148.80203: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 7554 1726853148.80237: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 7554 1726853148.80241: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f74740> <<< 7554 1726853148.80367: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0f75e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 7554 1726853148.80370: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f76d20> <<< 7554 1726853148.80404: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.80412: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0f77350> <<< 7554 1726853148.80418: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f76270> <<< 7554 1726853148.80447: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 7554 1726853148.80455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 7554 1726853148.80509: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.80517: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0f77d70> <<< 7554 1726853148.80522: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f774a0> <<< 7554 1726853148.80583: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f5a570> <<< 7554 1726853148.80657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 7554 1726853148.80677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 7554 1726853148.80713: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.80716: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.80727: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0cf3c80> <<< 7554 1726853148.80740: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 7554 1726853148.80956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 7554 1726853148.80962: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0d1c7d0> import 'bisect' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d1c530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0d1c800> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.81099: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.81104: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0d1d130> <<< 7554 1726853148.81283: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.81291: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0d1daf0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d1c9e0> <<< 7554 1726853148.81318: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0cf1e20> <<< 7554 1726853148.81333: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 7554 1726853148.81378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 7554 
1726853148.81409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 7554 1726853148.81428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 7554 1726853148.81434: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d1eea0> <<< 7554 1726853148.81469: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d1dc10> <<< 7554 1726853148.81489: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f5ac60> <<< 7554 1726853148.81527: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 7554 1726853148.81861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc'<<< 7554 1726853148.81866: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d47230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 7554 1726853148.81876: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d6b5f0> <<< 7554 1726853148.81902: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 7554 1726853148.81963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 7554 1726853148.82041: stdout chunk (state=3): >>>import 'ntpath' # <<< 7554 1726853148.82159: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0dcc3e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 7554 1726853148.82201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 7554 1726853148.82325: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0dceb40> <<< 7554 1726853148.82440: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0dcc500> <<< 7554 1726853148.82490: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d8d400> <<< 7554 1726853148.82527: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0715460> <<< 7554 1726853148.82546: stdout chunk (state=3): >>>import 'zipfile._path' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d6a3f0> <<< 7554 1726853148.82555: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d1fe00> <<< 7554 1726853148.82724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 7554 1726853148.82753: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb1b0d6a750> <<< 7554 1726853148.83260: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_c3qvn2zz/ansible_stat_payload.zip' # zipimport: zlib available <<< 7554 1726853148.83322: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.83363: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 7554 1726853148.83370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 7554 1726853148.83438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 7554 1726853148.83547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 7554 1726853148.83585: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 7554 1726853148.83589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b076b140> <<< 7554 1726853148.83605: stdout chunk (state=3): >>>import '_typing' # <<< 7554 1726853148.83877: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b074a030> <<< 7554 1726853148.83880: stdout chunk (state=3): 
>>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0749190> # zipimport: zlib available <<< 7554 1726853148.83923: stdout chunk (state=3): >>>import 'ansible' # <<< 7554 1726853148.83928: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.83956: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.83966: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.84158: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 7554 1726853148.86199: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.88034: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 7554 1726853148.88038: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0769010> <<< 7554 1726853148.88061: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 7554 1726853148.88066: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.88095: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 7554 1726853148.88114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 7554 1726853148.88136: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 7554 1726853148.88141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 7554 1726853148.88176: stdout chunk (state=3): >>># 
extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.88203: stdout chunk (state=3): >>> <<< 7554 1726853148.88207: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0792b10><<< 7554 1726853148.88209: stdout chunk (state=3): >>> <<< 7554 1726853148.88276: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b07928a0><<< 7554 1726853148.88281: stdout chunk (state=3): >>> <<< 7554 1726853148.88329: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b07921b0><<< 7554 1726853148.88334: stdout chunk (state=3): >>> <<< 7554 1726853148.88355: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py<<< 7554 1726853148.88386: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 7554 1726853148.88446: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0792c00><<< 7554 1726853148.88452: stdout chunk (state=3): >>> <<< 7554 1726853148.88477: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b076bdd0> import 'atexit' # <<< 7554 1726853148.88521: stdout chunk (state=3): >>> # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.88538: stdout chunk (state=3): >>> <<< 7554 1726853148.88541: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.88548: stdout chunk (state=3): >>> import 'grp' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b07938c0><<< 7554 1726853148.88588: stdout chunk (state=3): >>> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.88607: stdout chunk (state=3): >>> # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.88612: stdout chunk (state=3): >>> import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0793b00><<< 7554 1726853148.88658: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 7554 1726853148.88739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 7554 1726853148.88763: stdout chunk (state=3): >>> import '_locale' # <<< 7554 1726853148.88768: stdout chunk (state=3): >>> <<< 7554 1726853148.88828: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0793f80><<< 7554 1726853148.88856: stdout chunk (state=3): >>> import 'pwd' # <<< 7554 1726853148.88884: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py<<< 7554 1726853148.88929: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 7554 1726853148.88934: stdout chunk (state=3): >>> <<< 7554 1726853148.88992: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0601df0><<< 7554 1726853148.88995: stdout chunk (state=3): >>> <<< 7554 1726853148.89023: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.89028: stdout chunk (state=3): >>> <<< 7554 1726853148.89072: stdout chunk (state=3): 
>>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0603a10> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py<<< 7554 1726853148.89076: stdout chunk (state=3): >>> <<< 7554 1726853148.89094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 7554 1726853148.89153: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0604350> <<< 7554 1726853148.89182: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 7554 1726853148.89235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 7554 1726853148.89266: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b06054f0> <<< 7554 1726853148.89353: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 7554 1726853148.89382: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 7554 1726853148.89399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 7554 1726853148.89486: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0607fe0><<< 7554 1726853148.89491: stdout chunk (state=3): >>> <<< 7554 1726853148.89544: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 7554 
1726853148.89553: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.89590: stdout chunk (state=3): >>> import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b060c110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b06062a0><<< 7554 1726853148.89596: stdout chunk (state=3): >>> <<< 7554 1726853148.89621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py<<< 7554 1726853148.89655: stdout chunk (state=3): >>> <<< 7554 1726853148.89682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc'<<< 7554 1726853148.89685: stdout chunk (state=3): >>> <<< 7554 1726853148.89731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 7554 1726853148.89764: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 7554 1726853148.89813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 7554 1726853148.89846: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 7554 1726853148.89859: stdout chunk (state=3): >>> <<< 7554 1726853148.89869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 7554 1726853148.89885: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b060ffb0> <<< 7554 1726853148.90056: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb1b060ea80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b060e7e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 7554 1726853148.90162: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b060ed50> <<< 7554 1726853148.90207: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b06067b0> <<< 7554 1726853148.90243: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.90263: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.90268: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0657f50> <<< 7554 1726853148.90315: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.90320: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0658230> <<< 7554 1726853148.90367: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 7554 1726853148.90404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 7554 1726853148.90440: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 7554 1726853148.90456: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 7554 1726853148.90519: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.90522: stdout chunk (state=3): >>> <<< 7554 1726853148.90527: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0659cd0><<< 7554 1726853148.90567: stdout chunk (state=3): >>> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0659a90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 7554 1726853148.90783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 7554 1726853148.90850: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.90867: stdout chunk (state=3): >>> # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b065c260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b065a3c0><<< 7554 1726853148.90911: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 7554 1726853148.90984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.91014: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 7554 1726853148.91042: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 7554 1726853148.91049: stdout chunk (state=3): >>> <<< 7554 1726853148.91127: stdout chunk (state=3): >>>import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b065fa40><<< 7554 1726853148.91132: stdout chunk (state=3): >>> <<< 7554 1726853148.91325: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b065c410><<< 7554 1726853148.91332: stdout chunk (state=3): >>> <<< 7554 1726853148.91424: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.91427: stdout chunk (state=3): >>> <<< 7554 1726853148.91430: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0660a40><<< 7554 1726853148.91446: stdout chunk (state=3): >>> <<< 7554 1726853148.91487: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.91492: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.91568: stdout chunk (state=3): >>> import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0660bf0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.91586: stdout chunk 
(state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.91590: stdout chunk (state=3): >>> import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0660b00> <<< 7554 1726853148.91619: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0658350> <<< 7554 1726853148.91654: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 7554 1726853148.91691: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 7554 1726853148.91694: stdout chunk (state=3): >>> <<< 7554 1726853148.91733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 7554 1726853148.91778: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.91821: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.91878: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b04ec260> <<< 7554 1726853148.92084: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.92102: stdout chunk (state=3): >>> <<< 7554 1726853148.92119: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.92122: 
stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b04ed670><<< 7554 1726853148.92127: stdout chunk (state=3): >>> <<< 7554 1726853148.92198: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b06629f0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.92212: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0663da0><<< 7554 1726853148.92218: stdout chunk (state=3): >>> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0662660><<< 7554 1726853148.92251: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7554 1726853148.92260: stdout chunk (state=3): >>> <<< 7554 1726853148.92277: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.92294: stdout chunk (state=3): >>> import 'ansible.module_utils.compat' # <<< 7554 1726853148.92323: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7554 1726853148.92328: stdout chunk (state=3): >>> <<< 7554 1726853148.92467: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.92554: stdout chunk (state=3): >>> <<< 7554 1726853148.92617: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.92623: stdout chunk (state=3): >>> <<< 7554 1726853148.92643: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.92656: stdout chunk (state=3): >>> <<< 7554 1726853148.92665: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 7554 1726853148.92695: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.92700: stdout chunk (state=3): >>> <<< 7554 
1726853148.92721: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.92727: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 7554 1726853148.92741: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.92865: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.92983: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.93547: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.94102: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 7554 1726853148.94114: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 7554 1726853148.94117: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 7554 1726853148.94135: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 7554 1726853148.94164: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 7554 1726853148.94223: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 7554 1726853148.94228: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b04f57c0> <<< 7554 1726853148.94306: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 7554 1726853148.94317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 7554 1726853148.94324: stdout chunk (state=3): >>>import 'ctypes._endian' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04f6540> <<< 7554 1726853148.94338: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04ed4c0> <<< 7554 1726853148.94397: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 7554 1726853148.94408: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.94425: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.94435: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 7554 1726853148.94445: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.94596: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.95159: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04f6570> # zipimport: zlib available <<< 7554 1726853148.95536: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.96267: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.96376: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.96480: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 7554 1726853148.96493: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.96542: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.96659: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 7554 1726853148.96689: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.96815: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 7554 1726853148.96835: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 
1726853148.96846: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.96862: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 7554 1726853148.96873: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.96924: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.96981: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 7554 1726853148.96984: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.97341: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.97694: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 7554 1726853148.97784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 7554 1726853148.97798: stdout chunk (state=3): >>>import '_ast' # <<< 7554 1726853148.97890: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04f7860> <<< 7554 1726853148.97954: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.98009: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.98104: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 7554 1726853148.98124: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 7554 1726853148.98127: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 7554 1726853148.98342: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 7554 1726853148.98384: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.98458: stdout chunk (state=3): >>># zipimport: zlib available <<< 
7554 1726853148.98555: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 7554 1726853148.98617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 7554 1726853148.98623: stdout chunk (state=3): >>> <<< 7554 1726853148.98736: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 7554 1726853148.98809: stdout chunk (state=3): >>> # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0502330> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04ff5f0> <<< 7554 1726853148.98857: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 7554 1726853148.98867: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 7554 1726853148.98892: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.98996: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.99093: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7554 1726853148.99101: stdout chunk (state=3): >>> <<< 7554 1726853148.99139: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853148.99144: stdout chunk (state=3): >>> <<< 7554 1726853148.99209: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 7554 1726853148.99217: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 7554 1726853148.99229: stdout 
chunk (state=3): >>> <<< 7554 1726853148.99260: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 7554 1726853148.99265: stdout chunk (state=3): >>> <<< 7554 1726853148.99306: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 7554 1726853148.99310: stdout chunk (state=3): >>> <<< 7554 1726853148.99345: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 7554 1726853148.99353: stdout chunk (state=3): >>> <<< 7554 1726853148.99440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 7554 1726853148.99474: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 7554 1726853148.99503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 7554 1726853148.99600: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b05eea80><<< 7554 1726853148.99605: stdout chunk (state=3): >>> <<< 7554 1726853148.99672: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b07ce7b0><<< 7554 1726853148.99679: stdout chunk (state=3): >>> <<< 7554 1726853148.99793: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0502060><<< 7554 1726853148.99798: stdout chunk (state=3): >>> <<< 7554 1726853148.99812: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04ed9d0> # destroy ansible.module_utils.distro <<< 7554 1726853148.99821: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 7554 
1726853148.99848: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.99898: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853148.99933: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 7554 1726853149.00024: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 7554 1726853149.00048: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853149.00081: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853149.00094: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 7554 1726853149.00118: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853149.00334: stdout chunk (state=3): >>># zipimport: zlib available<<< 7554 1726853149.00337: stdout chunk (state=3): >>> <<< 7554 1726853149.00639: stdout chunk (state=3): >>># zipimport: zlib available <<< 7554 1726853149.00789: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 7554 1726853149.00953: stdout chunk (state=3): >>># destroy __main__ <<< 7554 1726853149.01303: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._<<< 7554 1726853149.01326: stdout chunk (state=3): >>> # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings<<< 7554 1726853149.01347: stdout chunk (state=3): >>> # cleanup[2] 
removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc<<< 7554 1726853149.01360: stdout chunk (state=3): >>> # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os <<< 7554 1726853149.01385: stdout chunk (state=3): >>># cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib <<< 7554 1726853149.01404: stdout chunk (state=3): >>># destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib<<< 7554 1726853149.01418: 
stdout chunk (state=3): >>> # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy <<< 7554 1726853149.01436: stdout chunk (state=3): >>># cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random <<< 7554 1726853149.01447: stdout chunk (state=3): >>># cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress<<< 7554 1726853149.01552: stdout chunk (state=3): >>> # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob <<< 7554 1726853149.01587: stdout chunk (state=3): >>># cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] 
removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # 
destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils <<< 7554 1726853149.01599: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.sys_info # destroy 
ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 7554 1726853149.01974: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 7554 1726853149.01984: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 7554 1726853149.02000: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 7554 1726853149.02006: stdout chunk (state=3): >>># destroy _blake2 <<< 7554 1726853149.02173: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime <<< 7554 1726853149.02182: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 7554 1726853149.02193: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 7554 1726853149.02245: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 7554 1726853149.02288: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 7554 1726853149.02299: stdout 
chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 <<< 7554 1726853149.02307: stdout chunk (state=3): >>># cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 7554 1726853149.02314: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing<<< 7554 1726853149.02334: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 7554 1726853149.02342: stdout chunk (state=3): >>># cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 7554 1726853149.02558: stdout chunk (state=3): >>># cleanup[3] wiping re._parser <<< 7554 1726853149.02564: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] 
wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 7554 1726853149.02592: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 7554 1726853149.02609: stdout chunk (state=3): >>># destroy _collections <<< 7554 1726853149.02640: stdout chunk (state=3): >>># destroy platform <<< 7554 1726853149.02649: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 7554 1726853149.02674: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 7554 1726853149.02723: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 7554 1726853149.02726: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse <<< 7554 1726853149.02729: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 7554 1726853149.02731: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp <<< 7554 1726853149.02750: stdout chunk (state=3): >>># destroy _io # destroy marshal <<< 7554 1726853149.02757: stdout chunk (state=3): >>># clear 
sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 7554 1726853149.02858: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig <<< 7554 1726853149.02881: stdout chunk (state=3): >>># destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback <<< 7554 1726853149.02884: stdout chunk (state=3): >>># destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 7554 1726853149.02890: stdout chunk (state=3): >>># destroy time <<< 7554 1726853149.02912: stdout chunk (state=3): >>># destroy _random <<< 7554 1726853149.02921: stdout chunk (state=3): >>># destroy _weakref <<< 7554 1726853149.02933: stdout chunk (state=3): >>># destroy _hashlib <<< 7554 1726853149.02950: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re <<< 7554 1726853149.03156: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 7554 1726853149.03464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853149.03499: stderr chunk (state=3): >>><<< 7554 1726853149.03502: stdout chunk (state=3): >>><<< 7554 1726853149.03567: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b10dc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b10abb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b10dea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0e91130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0e91fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ecfec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ecff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f078c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f07f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ee7b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ee52b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ecd070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f27860> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f26480> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ee6180> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f24c50> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f588f0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ecc2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0f58da0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f58c50> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0f58fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0ecae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f59640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f59340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f5a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f74740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0f75e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f76d20> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0f77350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f76270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0f77d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f774a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f5a570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0cf3c80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0d1c7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d1c530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0d1c800> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0d1d130> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0d1daf0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d1c9e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0cf1e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d1eea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d1dc10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0f5ac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d47230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d6b5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0dcc3e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0dceb40> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0dcc500> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d8d400> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0715460> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d6a3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0d1fe00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb1b0d6a750> # zipimport: found 30 names in '/tmp/ansible_stat_payload_c3qvn2zz/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb1b076b140> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b074a030> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0749190> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0769010> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0792b10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b07928a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b07921b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0792c00> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b076bdd0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b07938c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0793b00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0793f80> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0601df0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0603a10> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0604350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b06054f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0607fe0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b060c110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b06062a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b060ffb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b060ea80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b060e7e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b060ed50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b06067b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0657f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0658230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0659cd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0659a90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b065c260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b065a3c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b065fa40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b065c410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0660a40> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0660bf0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0660b00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0658350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b04ec260> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b04ed670> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b06629f0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0663da0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0662660> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b04f57c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04f6540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04ed4c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04f6570> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04f7860> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1b0502330> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04ff5f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b05eea80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b07ce7b0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b0502060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1b04ed9d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed.
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2]
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select 
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # 
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks
7554 1726853149.04105: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
7554 1726853149.04117: _low_level_execute_command(): starting
7554 1726853149.04119: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853148.5428276-7688-225039034052008/ > /dev/null 2>&1 && sleep 0'
7554 1726853149.04227: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
7554 1726853149.04243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853149.04249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853149.04296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
7554 1726853149.04299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7554 1726853149.04305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7554 1726853149.04364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7554 1726853149.06256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7554 1726853149.06262: stderr chunk (state=3): >>><<<
7554 1726853149.06265: stdout chunk (state=3): >>><<<
7554 1726853149.06284: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7554 1726853149.06288: handler run complete
7554 1726853149.06303: attempt loop complete, returning result
7554 1726853149.06306: _execute() done
7554 1726853149.06308: dumping result to json
7554 1726853149.06311: done dumping result, returning
7554 1726853149.06318: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [02083763-bbaf-bdc3-98b6-000000000168]
7554 1726853149.06323: sending task result for task 02083763-bbaf-bdc3-98b6-000000000168
7554 1726853149.06437: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000168
7554 1726853149.06440: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
7554 1726853149.06514: no more pending results, returning what we have
7554 1726853149.06517: results queue empty
7554 1726853149.06517: checking for any_errors_fatal
7554 1726853149.06524: done checking for any_errors_fatal
7554 1726853149.06525: checking for max_fail_percentage
7554 1726853149.06526: done checking for max_fail_percentage
7554 1726853149.06527: checking to see if all hosts have failed and the running result is not ok
7554 1726853149.06528: done checking to see if all hosts have failed
7554 1726853149.06529: getting the remaining hosts for this loop
7554 1726853149.06530: done getting the remaining hosts for this loop
7554 1726853149.06533: getting the next task for host managed_node3
7554 1726853149.06538: done getting next task for host managed_node3
7554 1726853149.06540: ^ task is: TASK: Set flag to indicate system is ostree
7554 1726853149.06543: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853149.06549: getting variables
7554 1726853149.06550: in VariableManager get_vars()
7554 1726853149.06579: Calling all_inventory to load vars for managed_node3
7554 1726853149.06581: Calling groups_inventory to load vars for managed_node3
7554 1726853149.06584: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853149.06598: Calling all_plugins_play to load vars for managed_node3
7554 1726853149.06601: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853149.06604: Calling groups_plugins_play to load vars for managed_node3
7554 1726853149.06751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853149.06869: done with get_vars()
7554 1726853149.06879: done getting variables
7554 1726853149.06952: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22
Friday 20 September 2024 13:25:49 -0400 (0:00:00.576) 0:00:03.037 ******
7554 1726853149.06974: entering _queue_task() for managed_node3/set_fact
7554 1726853149.06975: Creating lock for set_fact
7554 1726853149.07186: worker is 1 (out of 1 available)
7554 1726853149.07200: exiting _queue_task() for managed_node3/set_fact
7554 1726853149.07210: done queuing things up, now waiting for results queue to drain
7554 1726853149.07212: waiting for pending results...
7554 1726853149.07357: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree
7554 1726853149.07420: in run() - task 02083763-bbaf-bdc3-98b6-000000000169
7554 1726853149.07434: variable 'ansible_search_path' from source: unknown
7554 1726853149.07438: variable 'ansible_search_path' from source: unknown
7554 1726853149.07464: calling self._execute()
7554 1726853149.07520: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853149.07525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853149.07533: variable 'omit' from source: magic vars
7554 1726853149.07913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7554 1726853149.08079: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7554 1726853149.08118: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7554 1726853149.08141: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7554 1726853149.08167: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7554 1726853149.08231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7554 1726853149.08251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7554 1726853149.08267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853149.08286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7554 1726853149.08375: Evaluated conditional (not __network_is_ostree is defined): True
7554 1726853149.08379: variable 'omit' from source: magic vars
7554 1726853149.08404: variable 'omit' from source: magic vars
7554 1726853149.08487: variable '__ostree_booted_stat' from source: set_fact
7554 1726853149.08524: variable 'omit' from source: magic vars
7554 1726853149.08549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7554 1726853149.08568: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7554 1726853149.08584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7554 1726853149.08597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7554 1726853149.08606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7554 1726853149.08629: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7554 1726853149.08632: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853149.08634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853149.08704: Set connection var ansible_shell_executable to /bin/sh
7554 1726853149.08710: Set connection var ansible_pipelining to False
7554 1726853149.08713: Set connection var ansible_shell_type to sh
7554 1726853149.08715: Set connection var ansible_connection to ssh
7554 1726853149.08723: Set connection var ansible_timeout to 10
7554 1726853149.08727: Set connection var ansible_module_compression to ZIP_DEFLATED
7554 1726853149.08748: variable 'ansible_shell_executable' from source: unknown
7554 1726853149.08752: variable 'ansible_connection' from source: unknown
7554 1726853149.08755: variable 'ansible_module_compression' from source: unknown
7554 1726853149.08757: variable 'ansible_shell_type' from source: unknown
7554 1726853149.08759: variable 'ansible_shell_executable' from source: unknown
7554 1726853149.08761: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853149.08763: variable 'ansible_pipelining' from source: unknown
7554 1726853149.08765: variable 'ansible_timeout' from source: unknown
7554 1726853149.08768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853149.08838: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7554 1726853149.08849: variable 'omit' from source: magic vars
7554 1726853149.08852: starting attempt loop
7554 1726853149.08854: running the handler
7554 1726853149.08863: handler run complete
7554 1726853149.08870: attempt loop complete, returning result
7554 1726853149.08874: _execute() done
7554 1726853149.08877: dumping result to json
7554 1726853149.08888: done dumping result, returning
7554 1726853149.08891: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [02083763-bbaf-bdc3-98b6-000000000169]
7554 1726853149.08893: sending task result for task 02083763-bbaf-bdc3-98b6-000000000169
7554 1726853149.08967: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000169
7554 1726853149.08970: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
7554 1726853149.09055: no more pending results, returning what we have
7554 1726853149.09057: results queue empty
7554 1726853149.09058: checking for any_errors_fatal
7554 1726853149.09063: done checking for any_errors_fatal
7554 1726853149.09064: checking for max_fail_percentage
7554 1726853149.09065: done checking for max_fail_percentage
7554 1726853149.09066: checking to see if all hosts have failed and the running result is not ok
7554 1726853149.09066: done checking to see if all hosts have failed
7554 1726853149.09067: getting the remaining hosts for this loop
7554 1726853149.09068: done getting the remaining hosts for this loop
7554 1726853149.09073: getting the next task for host managed_node3
7554 1726853149.09080: done getting next task for host managed_node3
7554 1726853149.09083: ^ task is: TASK: Fix CentOS6 Base repo
7554 1726853149.09085: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853149.09088: getting variables
7554 1726853149.09089: in VariableManager get_vars()
7554 1726853149.09120: Calling all_inventory to load vars for managed_node3
7554 1726853149.09122: Calling groups_inventory to load vars for managed_node3
7554 1726853149.09125: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853149.09133: Calling all_plugins_play to load vars for managed_node3
7554 1726853149.09135: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853149.09143: Calling groups_plugins_play to load vars for managed_node3
7554 1726853149.09281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853149.09395: done with get_vars()
7554 1726853149.09401: done getting variables
7554 1726853149.09489: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Friday 20 September 2024 13:25:49 -0400 (0:00:00.025) 0:00:03.062 ******
7554 1726853149.09507: entering _queue_task() for managed_node3/copy
7554 1726853149.09695: worker is 1 (out of 1 available)
7554 1726853149.09708: exiting _queue_task() for managed_node3/copy
7554 1726853149.09719: done queuing things up, now waiting for results queue to drain
7554 1726853149.09721: waiting for pending results...
7554 1726853149.09862: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo
7554 1726853149.09921: in run() - task 02083763-bbaf-bdc3-98b6-00000000016b
7554 1726853149.09933: variable 'ansible_search_path' from source: unknown
7554 1726853149.09936: variable 'ansible_search_path' from source: unknown
7554 1726853149.09965: calling self._execute()
7554 1726853149.10021: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853149.10024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853149.10032: variable 'omit' from source: magic vars
7554 1726853149.10359: variable 'ansible_distribution' from source: facts
7554 1726853149.10380: Evaluated conditional (ansible_distribution == 'CentOS'): True
7554 1726853149.10459: variable 'ansible_distribution_major_version' from source: facts
7554 1726853149.10465: Evaluated conditional (ansible_distribution_major_version == '6'): False
7554 1726853149.10468: when evaluation is False, skipping this task
7554 1726853149.10470: _execute() done
7554 1726853149.10475: dumping result to json
7554 1726853149.10477: done dumping result, returning
7554 1726853149.10485: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [02083763-bbaf-bdc3-98b6-00000000016b]
7554 1726853149.10488: sending task result for task 02083763-bbaf-bdc3-98b6-00000000016b
7554 1726853149.10578: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000016b
7554 1726853149.10580: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
7554 1726853149.10650: no more pending results, returning what we have
7554 1726853149.10653: results queue empty
7554 1726853149.10654: checking for any_errors_fatal
7554 1726853149.10657: done checking for any_errors_fatal
7554 1726853149.10658: checking for max_fail_percentage
7554 1726853149.10659:
done checking for max_fail_percentage 7554 1726853149.10660: checking to see if all hosts have failed and the running result is not ok 7554 1726853149.10660: done checking to see if all hosts have failed 7554 1726853149.10661: getting the remaining hosts for this loop 7554 1726853149.10662: done getting the remaining hosts for this loop 7554 1726853149.10665: getting the next task for host managed_node3 7554 1726853149.10670: done getting next task for host managed_node3 7554 1726853149.10674: ^ task is: TASK: Include the task 'enable_epel.yml' 7554 1726853149.10677: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853149.10680: getting variables 7554 1726853149.10681: in VariableManager get_vars() 7554 1726853149.10703: Calling all_inventory to load vars for managed_node3 7554 1726853149.10706: Calling groups_inventory to load vars for managed_node3 7554 1726853149.10708: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.10717: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.10719: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.10722: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.10830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.10965: done with get_vars() 7554 1726853149.10973: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:25:49 -0400 (0:00:00.015) 0:00:03.077 ****** 7554 1726853149.11031: entering _queue_task() for managed_node3/include_tasks 7554 1726853149.11224: worker is 1 (out of 1 available) 7554 1726853149.11236: exiting _queue_task() for managed_node3/include_tasks 7554 1726853149.11250: done queuing things up, now waiting for results queue to drain 7554 1726853149.11252: waiting for pending results... 
7554 1726853149.11397: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 7554 1726853149.11452: in run() - task 02083763-bbaf-bdc3-98b6-00000000016c 7554 1726853149.11462: variable 'ansible_search_path' from source: unknown 7554 1726853149.11465: variable 'ansible_search_path' from source: unknown 7554 1726853149.11498: calling self._execute() 7554 1726853149.11552: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.11555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.11563: variable 'omit' from source: magic vars 7554 1726853149.11899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853149.13488: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853149.13528: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853149.13568: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853149.13595: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853149.13615: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853149.13675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853149.13695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853149.13713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853149.13738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853149.13752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853149.13830: variable '__network_is_ostree' from source: set_fact 7554 1726853149.13844: Evaluated conditional (not __network_is_ostree | d(false)): True 7554 1726853149.13850: _execute() done 7554 1726853149.13852: dumping result to json 7554 1726853149.13855: done dumping result, returning 7554 1726853149.13861: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-bdc3-98b6-00000000016c] 7554 1726853149.13864: sending task result for task 02083763-bbaf-bdc3-98b6-00000000016c 7554 1726853149.13953: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000016c 7554 1726853149.13956: WORKER PROCESS EXITING 7554 1726853149.14000: no more pending results, returning what we have 7554 1726853149.14004: in VariableManager get_vars() 7554 1726853149.14035: Calling all_inventory to load vars for managed_node3 7554 1726853149.14037: Calling groups_inventory to load vars for managed_node3 7554 1726853149.14040: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.14052: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.14055: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.14057: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.14215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 
1726853149.14400: done with get_vars() 7554 1726853149.14409: variable 'ansible_search_path' from source: unknown 7554 1726853149.14410: variable 'ansible_search_path' from source: unknown 7554 1726853149.14446: we have included files to process 7554 1726853149.14448: generating all_blocks data 7554 1726853149.14449: done generating all_blocks data 7554 1726853149.14455: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 7554 1726853149.14457: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 7554 1726853149.14460: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 7554 1726853149.15408: done processing included file 7554 1726853149.15410: iterating over new_blocks loaded from include file 7554 1726853149.15412: in VariableManager get_vars() 7554 1726853149.15425: done with get_vars() 7554 1726853149.15427: filtering new block on tags 7554 1726853149.15448: done filtering new block on tags 7554 1726853149.15452: in VariableManager get_vars() 7554 1726853149.15468: done with get_vars() 7554 1726853149.15470: filtering new block on tags 7554 1726853149.15485: done filtering new block on tags 7554 1726853149.15487: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 7554 1726853149.15493: extending task lists for all hosts with included blocks 7554 1726853149.15598: done extending task lists 7554 1726853149.15600: done processing included files 7554 1726853149.15601: results queue empty 7554 1726853149.15601: checking for any_errors_fatal 7554 1726853149.15604: done checking for any_errors_fatal 7554 1726853149.15605: checking for max_fail_percentage 7554 1726853149.15606: done checking for max_fail_percentage 7554 
1726853149.15607: checking to see if all hosts have failed and the running result is not ok 7554 1726853149.15608: done checking to see if all hosts have failed 7554 1726853149.15608: getting the remaining hosts for this loop 7554 1726853149.15609: done getting the remaining hosts for this loop 7554 1726853149.15612: getting the next task for host managed_node3 7554 1726853149.15616: done getting next task for host managed_node3 7554 1726853149.15618: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 7554 1726853149.15621: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853149.15623: getting variables 7554 1726853149.15624: in VariableManager get_vars() 7554 1726853149.15632: Calling all_inventory to load vars for managed_node3 7554 1726853149.15634: Calling groups_inventory to load vars for managed_node3 7554 1726853149.15637: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.15642: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.15650: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.15654: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.15814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.16004: done with get_vars() 7554 1726853149.16018: done getting variables 7554 1726853149.16082: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 7554 1726853149.16289: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:25:49 -0400 (0:00:00.053) 0:00:03.130 ****** 7554 1726853149.16339: entering _queue_task() for managed_node3/command 7554 1726853149.16341: Creating lock for command 7554 1726853149.16655: worker is 1 (out of 1 available) 7554 1726853149.16782: exiting _queue_task() for managed_node3/command 7554 1726853149.16793: done queuing things up, now waiting for results queue to drain 7554 1726853149.16795: waiting for pending results... 
7554 1726853149.16992: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 7554 1726853149.17059: in run() - task 02083763-bbaf-bdc3-98b6-000000000186 7554 1726853149.17075: variable 'ansible_search_path' from source: unknown 7554 1726853149.17080: variable 'ansible_search_path' from source: unknown 7554 1726853149.17221: calling self._execute() 7554 1726853149.17224: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.17227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.17230: variable 'omit' from source: magic vars 7554 1726853149.17646: variable 'ansible_distribution' from source: facts 7554 1726853149.17665: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7554 1726853149.17798: variable 'ansible_distribution_major_version' from source: facts 7554 1726853149.17809: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7554 1726853149.17816: when evaluation is False, skipping this task 7554 1726853149.17824: _execute() done 7554 1726853149.17831: dumping result to json 7554 1726853149.17847: done dumping result, returning 7554 1726853149.17859: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [02083763-bbaf-bdc3-98b6-000000000186] 7554 1726853149.17869: sending task result for task 02083763-bbaf-bdc3-98b6-000000000186 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7554 1726853149.18113: no more pending results, returning what we have 7554 1726853149.18117: results queue empty 7554 1726853149.18118: checking for any_errors_fatal 7554 1726853149.18119: done checking for any_errors_fatal 7554 1726853149.18120: checking for max_fail_percentage 7554 1726853149.18122: done checking for max_fail_percentage 7554 1726853149.18123: checking to see if all hosts have failed and the running result is 
not ok 7554 1726853149.18125: done checking to see if all hosts have failed 7554 1726853149.18125: getting the remaining hosts for this loop 7554 1726853149.18127: done getting the remaining hosts for this loop 7554 1726853149.18131: getting the next task for host managed_node3 7554 1726853149.18138: done getting next task for host managed_node3 7554 1726853149.18141: ^ task is: TASK: Install yum-utils package 7554 1726853149.18145: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853149.18150: getting variables 7554 1726853149.18152: in VariableManager get_vars() 7554 1726853149.18302: Calling all_inventory to load vars for managed_node3 7554 1726853149.18305: Calling groups_inventory to load vars for managed_node3 7554 1726853149.18309: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.18317: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000186 7554 1726853149.18320: WORKER PROCESS EXITING 7554 1726853149.18333: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.18336: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.18340: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.18646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.18850: done with get_vars() 7554 1726853149.18862: done getting variables 7554 1726853149.18975: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:25:49 -0400 (0:00:00.026) 0:00:03.157 ****** 7554 1726853149.19006: entering _queue_task() for managed_node3/package 7554 1726853149.19008: Creating lock for package 7554 1726853149.19394: worker is 1 (out of 1 available) 7554 1726853149.19404: exiting _queue_task() for managed_node3/package 7554 1726853149.19415: done queuing things up, now waiting for results queue to drain 7554 1726853149.19417: waiting for pending results... 
7554 1726853149.19568: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 7554 1726853149.19665: in run() - task 02083763-bbaf-bdc3-98b6-000000000187 7554 1726853149.19677: variable 'ansible_search_path' from source: unknown 7554 1726853149.19681: variable 'ansible_search_path' from source: unknown 7554 1726853149.19709: calling self._execute() 7554 1726853149.19836: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.19839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.19848: variable 'omit' from source: magic vars 7554 1726853149.20108: variable 'ansible_distribution' from source: facts 7554 1726853149.20117: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7554 1726853149.20205: variable 'ansible_distribution_major_version' from source: facts 7554 1726853149.20209: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7554 1726853149.20212: when evaluation is False, skipping this task 7554 1726853149.20215: _execute() done 7554 1726853149.20217: dumping result to json 7554 1726853149.20220: done dumping result, returning 7554 1726853149.20228: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [02083763-bbaf-bdc3-98b6-000000000187] 7554 1726853149.20235: sending task result for task 02083763-bbaf-bdc3-98b6-000000000187 7554 1726853149.20317: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000187 7554 1726853149.20320: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7554 1726853149.20425: no more pending results, returning what we have 7554 1726853149.20430: results queue empty 7554 1726853149.20431: checking for any_errors_fatal 7554 1726853149.20434: done checking for any_errors_fatal 7554 1726853149.20435: checking for 
max_fail_percentage 7554 1726853149.20436: done checking for max_fail_percentage 7554 1726853149.20437: checking to see if all hosts have failed and the running result is not ok 7554 1726853149.20437: done checking to see if all hosts have failed 7554 1726853149.20438: getting the remaining hosts for this loop 7554 1726853149.20439: done getting the remaining hosts for this loop 7554 1726853149.20442: getting the next task for host managed_node3 7554 1726853149.20449: done getting next task for host managed_node3 7554 1726853149.20451: ^ task is: TASK: Enable EPEL 7 7554 1726853149.20454: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853149.20457: getting variables 7554 1726853149.20458: in VariableManager get_vars() 7554 1726853149.20484: Calling all_inventory to load vars for managed_node3 7554 1726853149.20487: Calling groups_inventory to load vars for managed_node3 7554 1726853149.20489: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.20495: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.20497: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.20499: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.20600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.20719: done with get_vars() 7554 1726853149.20725: done getting variables 7554 1726853149.20764: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:25:49 -0400 (0:00:00.017) 0:00:03.175 ****** 7554 1726853149.20784: entering _queue_task() for managed_node3/command 7554 1726853149.20957: worker is 1 (out of 1 available) 7554 1726853149.20968: exiting _queue_task() for managed_node3/command 7554 1726853149.20980: done queuing things up, now waiting for results queue to drain 7554 1726853149.20982: waiting for pending results... 
7554 1726853149.21136: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 7554 1726853149.21209: in run() - task 02083763-bbaf-bdc3-98b6-000000000188 7554 1726853149.21220: variable 'ansible_search_path' from source: unknown 7554 1726853149.21224: variable 'ansible_search_path' from source: unknown 7554 1726853149.21258: calling self._execute() 7554 1726853149.21312: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.21315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.21323: variable 'omit' from source: magic vars 7554 1726853149.21598: variable 'ansible_distribution' from source: facts 7554 1726853149.21608: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7554 1726853149.21695: variable 'ansible_distribution_major_version' from source: facts 7554 1726853149.21701: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7554 1726853149.21704: when evaluation is False, skipping this task 7554 1726853149.21707: _execute() done 7554 1726853149.21709: dumping result to json 7554 1726853149.21712: done dumping result, returning 7554 1726853149.21718: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [02083763-bbaf-bdc3-98b6-000000000188] 7554 1726853149.21723: sending task result for task 02083763-bbaf-bdc3-98b6-000000000188 7554 1726853149.21804: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000188 7554 1726853149.21807: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7554 1726853149.21849: no more pending results, returning what we have 7554 1726853149.21852: results queue empty 7554 1726853149.21853: checking for any_errors_fatal 7554 1726853149.21857: done checking for any_errors_fatal 7554 1726853149.21858: checking for max_fail_percentage 7554 
1726853149.21860: done checking for max_fail_percentage 7554 1726853149.21860: checking to see if all hosts have failed and the running result is not ok 7554 1726853149.21861: done checking to see if all hosts have failed 7554 1726853149.21862: getting the remaining hosts for this loop 7554 1726853149.21863: done getting the remaining hosts for this loop 7554 1726853149.21866: getting the next task for host managed_node3 7554 1726853149.21881: done getting next task for host managed_node3 7554 1726853149.21883: ^ task is: TASK: Enable EPEL 8 7554 1726853149.21887: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853149.21890: getting variables 7554 1726853149.21891: in VariableManager get_vars() 7554 1726853149.21913: Calling all_inventory to load vars for managed_node3 7554 1726853149.21915: Calling groups_inventory to load vars for managed_node3 7554 1726853149.21917: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.21925: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.21927: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.21930: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.22039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.22177: done with get_vars() 7554 1726853149.22184: done getting variables 7554 1726853149.22224: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:25:49 -0400 (0:00:00.014) 0:00:03.189 ****** 7554 1726853149.22242: entering _queue_task() for managed_node3/command 7554 1726853149.22410: worker is 1 (out of 1 available) 7554 1726853149.22423: exiting _queue_task() for managed_node3/command 7554 1726853149.22434: done queuing things up, now waiting for results queue to drain 7554 1726853149.22436: waiting for pending results... 
7554 1726853149.22577: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 7554 1726853149.22650: in run() - task 02083763-bbaf-bdc3-98b6-000000000189 7554 1726853149.22659: variable 'ansible_search_path' from source: unknown 7554 1726853149.22664: variable 'ansible_search_path' from source: unknown 7554 1726853149.22692: calling self._execute() 7554 1726853149.22748: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.22752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.22758: variable 'omit' from source: magic vars 7554 1726853149.23025: variable 'ansible_distribution' from source: facts 7554 1726853149.23035: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7554 1726853149.23124: variable 'ansible_distribution_major_version' from source: facts 7554 1726853149.23128: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7554 1726853149.23131: when evaluation is False, skipping this task 7554 1726853149.23135: _execute() done 7554 1726853149.23138: dumping result to json 7554 1726853149.23140: done dumping result, returning 7554 1726853149.23149: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [02083763-bbaf-bdc3-98b6-000000000189] 7554 1726853149.23152: sending task result for task 02083763-bbaf-bdc3-98b6-000000000189 7554 1726853149.23231: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000189 7554 1726853149.23234: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7554 1726853149.23284: no more pending results, returning what we have 7554 1726853149.23287: results queue empty 7554 1726853149.23288: checking for any_errors_fatal 7554 1726853149.23293: done checking for any_errors_fatal 7554 1726853149.23294: checking for max_fail_percentage 7554 
1726853149.23295: done checking for max_fail_percentage 7554 1726853149.23295: checking to see if all hosts have failed and the running result is not ok 7554 1726853149.23296: done checking to see if all hosts have failed 7554 1726853149.23297: getting the remaining hosts for this loop 7554 1726853149.23298: done getting the remaining hosts for this loop 7554 1726853149.23301: getting the next task for host managed_node3 7554 1726853149.23308: done getting next task for host managed_node3 7554 1726853149.23310: ^ task is: TASK: Enable EPEL 6 7554 1726853149.23313: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853149.23316: getting variables 7554 1726853149.23318: in VariableManager get_vars() 7554 1726853149.23339: Calling all_inventory to load vars for managed_node3 7554 1726853149.23341: Calling groups_inventory to load vars for managed_node3 7554 1726853149.23344: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.23354: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.23357: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.23359: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.23466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.23580: done with get_vars() 7554 1726853149.23588: done getting variables 7554 1726853149.23626: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:25:49 -0400 (0:00:00.013) 0:00:03.204 ****** 7554 1726853149.23644: entering _queue_task() for managed_node3/copy 7554 1726853149.23808: worker is 1 (out of 1 available) 7554 1726853149.23819: exiting _queue_task() for managed_node3/copy 7554 1726853149.23829: done queuing things up, now waiting for results queue to drain 7554 1726853149.23831: waiting for pending results... 
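The `Evaluated conditional (...): True/False` records above, and the `"false_condition"` field in the skip result, show how the executor decides whether a task runs: each `when:` expression is reduced to a boolean against the gathered facts, and evaluation stops at the first one that fails. A minimal Python sketch of that decision logic (not Ansible's actual implementation, which templates the expressions through Jinja2; the fact values are inferred from the log — the distribution check passed, the major-version check did not):

```python
# Facts for managed_node3, inferred from the log records above.
facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "10",  # assumption: any value outside ['7', '8']
}

def evaluate_when(conditions, facts):
    """Stop at the first failing condition, mirroring the
    'when evaluation is False, skipping this task' records."""
    for expr, check in conditions:
        if not check(facts):
            return False, expr  # expr becomes the reported false_condition
    return True, None

# The two conditions from the 'Enable EPEL 8' task, as (source text, predicate).
conditions = [
    ("ansible_distribution in ['RedHat', 'CentOS']",
     lambda f: f["ansible_distribution"] in ["RedHat", "CentOS"]),
    ("ansible_distribution_major_version in ['7', '8']",
     lambda f: f["ansible_distribution_major_version"] in ["7", "8"]),
]

ran, false_condition = evaluate_when(conditions, facts)
```

With these facts, `ran` is `False` and `false_condition` carries the second expression, matching the `skip_reason: Conditional result was False` output above.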
7554 1726853149.23977: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 7554 1726853149.24046: in run() - task 02083763-bbaf-bdc3-98b6-00000000018b 7554 1726853149.24062: variable 'ansible_search_path' from source: unknown 7554 1726853149.24066: variable 'ansible_search_path' from source: unknown 7554 1726853149.24093: calling self._execute() 7554 1726853149.24144: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.24152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.24162: variable 'omit' from source: magic vars 7554 1726853149.24418: variable 'ansible_distribution' from source: facts 7554 1726853149.24428: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7554 1726853149.24506: variable 'ansible_distribution_major_version' from source: facts 7554 1726853149.24510: Evaluated conditional (ansible_distribution_major_version == '6'): False 7554 1726853149.24513: when evaluation is False, skipping this task 7554 1726853149.24517: _execute() done 7554 1726853149.24520: dumping result to json 7554 1726853149.24522: done dumping result, returning 7554 1726853149.24529: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [02083763-bbaf-bdc3-98b6-00000000018b] 7554 1726853149.24534: sending task result for task 02083763-bbaf-bdc3-98b6-00000000018b 7554 1726853149.24621: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000018b 7554 1726853149.24625: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 7554 1726853149.24666: no more pending results, returning what we have 7554 1726853149.24670: results queue empty 7554 1726853149.24672: checking for any_errors_fatal 7554 1726853149.24676: done checking for any_errors_fatal 7554 1726853149.24677: checking for max_fail_percentage 7554 1726853149.24678: done 
checking for max_fail_percentage 7554 1726853149.24679: checking to see if all hosts have failed and the running result is not ok 7554 1726853149.24680: done checking to see if all hosts have failed 7554 1726853149.24681: getting the remaining hosts for this loop 7554 1726853149.24682: done getting the remaining hosts for this loop 7554 1726853149.24685: getting the next task for host managed_node3 7554 1726853149.24691: done getting next task for host managed_node3 7554 1726853149.24693: ^ task is: TASK: Set network provider to 'nm' 7554 1726853149.24695: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853149.24698: getting variables 7554 1726853149.24700: in VariableManager get_vars() 7554 1726853149.24721: Calling all_inventory to load vars for managed_node3 7554 1726853149.24723: Calling groups_inventory to load vars for managed_node3 7554 1726853149.24725: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.24733: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.24735: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.24738: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.24881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.24991: done with get_vars() 7554 1726853149.24997: done getting variables 7554 1726853149.25035: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set 
network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:13 Friday 20 September 2024 13:25:49 -0400 (0:00:00.014) 0:00:03.218 ****** 7554 1726853149.25054: entering _queue_task() for managed_node3/set_fact 7554 1726853149.25216: worker is 1 (out of 1 available) 7554 1726853149.25227: exiting _queue_task() for managed_node3/set_fact 7554 1726853149.25237: done queuing things up, now waiting for results queue to drain 7554 1726853149.25239: waiting for pending results... 7554 1726853149.25379: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 7554 1726853149.25431: in run() - task 02083763-bbaf-bdc3-98b6-000000000007 7554 1726853149.25443: variable 'ansible_search_path' from source: unknown 7554 1726853149.25476: calling self._execute() 7554 1726853149.25530: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.25535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.25542: variable 'omit' from source: magic vars 7554 1726853149.25615: variable 'omit' from source: magic vars 7554 1726853149.25635: variable 'omit' from source: magic vars 7554 1726853149.25661: variable 'omit' from source: magic vars 7554 1726853149.25697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853149.25724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853149.25741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853149.25755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853149.25765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 7554 1726853149.25795: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853149.25798: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.25801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.25866: Set connection var ansible_shell_executable to /bin/sh 7554 1726853149.25874: Set connection var ansible_pipelining to False 7554 1726853149.25877: Set connection var ansible_shell_type to sh 7554 1726853149.25879: Set connection var ansible_connection to ssh 7554 1726853149.25887: Set connection var ansible_timeout to 10 7554 1726853149.25891: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853149.25911: variable 'ansible_shell_executable' from source: unknown 7554 1726853149.25914: variable 'ansible_connection' from source: unknown 7554 1726853149.25917: variable 'ansible_module_compression' from source: unknown 7554 1726853149.25919: variable 'ansible_shell_type' from source: unknown 7554 1726853149.25922: variable 'ansible_shell_executable' from source: unknown 7554 1726853149.25924: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.25927: variable 'ansible_pipelining' from source: unknown 7554 1726853149.25929: variable 'ansible_timeout' from source: unknown 7554 1726853149.25932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.26033: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853149.26043: variable 'omit' from source: magic vars 7554 1726853149.26049: starting attempt loop 7554 1726853149.26052: running the handler 7554 1726853149.26062: handler run 
complete 7554 1726853149.26069: attempt loop complete, returning result 7554 1726853149.26073: _execute() done 7554 1726853149.26076: dumping result to json 7554 1726853149.26078: done dumping result, returning 7554 1726853149.26084: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [02083763-bbaf-bdc3-98b6-000000000007] 7554 1726853149.26091: sending task result for task 02083763-bbaf-bdc3-98b6-000000000007 7554 1726853149.26169: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000007 7554 1726853149.26174: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 7554 1726853149.26220: no more pending results, returning what we have 7554 1726853149.26223: results queue empty 7554 1726853149.26224: checking for any_errors_fatal 7554 1726853149.26230: done checking for any_errors_fatal 7554 1726853149.26231: checking for max_fail_percentage 7554 1726853149.26232: done checking for max_fail_percentage 7554 1726853149.26232: checking to see if all hosts have failed and the running result is not ok 7554 1726853149.26233: done checking to see if all hosts have failed 7554 1726853149.26234: getting the remaining hosts for this loop 7554 1726853149.26235: done getting the remaining hosts for this loop 7554 1726853149.26238: getting the next task for host managed_node3 7554 1726853149.26243: done getting next task for host managed_node3 7554 1726853149.26247: ^ task is: TASK: meta (flush_handlers) 7554 1726853149.26249: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853149.26253: getting variables 7554 1726853149.26254: in VariableManager get_vars() 7554 1726853149.26277: Calling all_inventory to load vars for managed_node3 7554 1726853149.26279: Calling groups_inventory to load vars for managed_node3 7554 1726853149.26282: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.26290: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.26292: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.26294: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.26401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.26511: done with get_vars() 7554 1726853149.26518: done getting variables 7554 1726853149.26565: in VariableManager get_vars() 7554 1726853149.26573: Calling all_inventory to load vars for managed_node3 7554 1726853149.26575: Calling groups_inventory to load vars for managed_node3 7554 1726853149.26576: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.26579: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.26580: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.26582: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.26664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.26790: done with get_vars() 7554 1726853149.26799: done queuing things up, now waiting for results queue to drain 7554 1726853149.26800: results queue empty 7554 1726853149.26801: checking for any_errors_fatal 7554 1726853149.26802: done checking for any_errors_fatal 7554 1726853149.26803: checking for max_fail_percentage 7554 1726853149.26803: done checking for max_fail_percentage 7554 1726853149.26804: checking to see if all hosts have failed and the running result is not ok 7554 1726853149.26804: 
done checking to see if all hosts have failed 7554 1726853149.26804: getting the remaining hosts for this loop 7554 1726853149.26805: done getting the remaining hosts for this loop 7554 1726853149.26807: getting the next task for host managed_node3 7554 1726853149.26809: done getting next task for host managed_node3 7554 1726853149.26810: ^ task is: TASK: meta (flush_handlers) 7554 1726853149.26811: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853149.26817: getting variables 7554 1726853149.26817: in VariableManager get_vars() 7554 1726853149.26822: Calling all_inventory to load vars for managed_node3 7554 1726853149.26823: Calling groups_inventory to load vars for managed_node3 7554 1726853149.26824: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.26827: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.26829: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.26830: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.26912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.27016: done with get_vars() 7554 1726853149.27022: done getting variables 7554 1726853149.27052: in VariableManager get_vars() 7554 1726853149.27057: Calling all_inventory to load vars for managed_node3 7554 1726853149.27059: Calling groups_inventory to load vars for managed_node3 7554 1726853149.27060: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.27063: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.27064: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.27066: Calling 
groups_plugins_play to load vars for managed_node3 7554 1726853149.27143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.27266: done with get_vars() 7554 1726853149.27275: done queuing things up, now waiting for results queue to drain 7554 1726853149.27276: results queue empty 7554 1726853149.27277: checking for any_errors_fatal 7554 1726853149.27277: done checking for any_errors_fatal 7554 1726853149.27278: checking for max_fail_percentage 7554 1726853149.27279: done checking for max_fail_percentage 7554 1726853149.27279: checking to see if all hosts have failed and the running result is not ok 7554 1726853149.27279: done checking to see if all hosts have failed 7554 1726853149.27280: getting the remaining hosts for this loop 7554 1726853149.27280: done getting the remaining hosts for this loop 7554 1726853149.27282: getting the next task for host managed_node3 7554 1726853149.27284: done getting next task for host managed_node3 7554 1726853149.27284: ^ task is: None 7554 1726853149.27285: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853149.27286: done queuing things up, now waiting for results queue to drain 7554 1726853149.27286: results queue empty 7554 1726853149.27287: checking for any_errors_fatal 7554 1726853149.27287: done checking for any_errors_fatal 7554 1726853149.27288: checking for max_fail_percentage 7554 1726853149.27288: done checking for max_fail_percentage 7554 1726853149.27289: checking to see if all hosts have failed and the running result is not ok 7554 1726853149.27289: done checking to see if all hosts have failed 7554 1726853149.27291: getting the next task for host managed_node3 7554 1726853149.27293: done getting next task for host managed_node3 7554 1726853149.27294: ^ task is: None 7554 1726853149.27295: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853149.27330: in VariableManager get_vars() 7554 1726853149.27353: done with get_vars() 7554 1726853149.27357: in VariableManager get_vars() 7554 1726853149.27368: done with get_vars() 7554 1726853149.27372: variable 'omit' from source: magic vars 7554 1726853149.27392: in VariableManager get_vars() 7554 1726853149.27407: done with get_vars() 7554 1726853149.27420: variable 'omit' from source: magic vars PLAY [Play for testing auto_gateway setting] *********************************** 7554 1726853149.27734: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 7554 1726853149.27754: getting the remaining hosts for this loop 7554 1726853149.27755: done getting the remaining hosts for this loop 7554 1726853149.27756: getting the next task for host managed_node3 7554 1726853149.27758: done getting next task for host managed_node3 7554 1726853149.27759: ^ task is: TASK: Gathering Facts 7554 1726853149.27760: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853149.27762: getting variables 7554 1726853149.27762: in VariableManager get_vars() 7554 1726853149.27775: Calling all_inventory to load vars for managed_node3 7554 1726853149.27776: Calling groups_inventory to load vars for managed_node3 7554 1726853149.27777: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853149.27780: Calling all_plugins_play to load vars for managed_node3 7554 1726853149.27789: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853149.27790: Calling groups_plugins_play to load vars for managed_node3 7554 1726853149.27875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853149.27981: done with get_vars() 7554 1726853149.27987: done getting variables 7554 1726853149.28011: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:3 Friday 20 September 2024 13:25:49 -0400 (0:00:00.029) 0:00:03.247 ****** 7554 1726853149.28025: entering _queue_task() for managed_node3/gather_facts 7554 1726853149.28200: worker is 1 (out of 1 available) 7554 1726853149.28211: exiting _queue_task() for managed_node3/gather_facts 7554 1726853149.28224: done queuing things up, now waiting for results queue to drain 7554 1726853149.28225: waiting for pending results... 
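The `_low_level_execute_command()` records that follow first probe the remote home directory (`echo ~`), then create a per-task remote temp directory such as `ansible-tmp-1726853149.3234267-7713-165480709039394`. The name encodes a fractional epoch timestamp, the controller PID, and a random suffix; a sketch of that naming scheme (the `2 ** 48` bound on the random part is an assumption modelled on the digit width seen in the log):

```python
import os
import random
import time

def remote_tmp_dir_name(prefix="ansible-tmp"):
    """Build a name shaped like ansible-tmp-<epoch>-<pid>-<random>,
    matching the directory names in the mkdir commands below."""
    return "%s-%s-%s-%s" % (prefix, time.time(), os.getpid(),
                            random.randint(0, 2 ** 48))

name = remote_tmp_dir_name()
```

The randomized, per-invocation name is what lets concurrent tasks (and concurrent playbook runs) share `/root/.ansible/tmp` without colliding; the `umask 77` in the actual `mkdir` command keeps the directory readable only by the connecting user.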
7554 1726853149.28373: running TaskExecutor() for managed_node3/TASK: Gathering Facts 7554 1726853149.28439: in run() - task 02083763-bbaf-bdc3-98b6-0000000001b1 7554 1726853149.28453: variable 'ansible_search_path' from source: unknown 7554 1726853149.28483: calling self._execute() 7554 1726853149.28539: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.28542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.28554: variable 'omit' from source: magic vars 7554 1726853149.28812: variable 'ansible_distribution_major_version' from source: facts 7554 1726853149.28822: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853149.28832: variable 'omit' from source: magic vars 7554 1726853149.28847: variable 'omit' from source: magic vars 7554 1726853149.28874: variable 'omit' from source: magic vars 7554 1726853149.28903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853149.28929: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853149.29019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853149.29034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853149.29052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853149.29072: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853149.29076: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.29079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.29148: Set connection var ansible_shell_executable to /bin/sh 7554 1726853149.29158: Set connection var 
ansible_pipelining to False 7554 1726853149.29163: Set connection var ansible_shell_type to sh 7554 1726853149.29170: Set connection var ansible_connection to ssh 7554 1726853149.29178: Set connection var ansible_timeout to 10 7554 1726853149.29183: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853149.29199: variable 'ansible_shell_executable' from source: unknown 7554 1726853149.29201: variable 'ansible_connection' from source: unknown 7554 1726853149.29204: variable 'ansible_module_compression' from source: unknown 7554 1726853149.29206: variable 'ansible_shell_type' from source: unknown 7554 1726853149.29212: variable 'ansible_shell_executable' from source: unknown 7554 1726853149.29214: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853149.29216: variable 'ansible_pipelining' from source: unknown 7554 1726853149.29219: variable 'ansible_timeout' from source: unknown 7554 1726853149.29223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853149.29352: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853149.29359: variable 'omit' from source: magic vars 7554 1726853149.29363: starting attempt loop 7554 1726853149.29366: running the handler 7554 1726853149.29385: variable 'ansible_facts' from source: unknown 7554 1726853149.29400: _low_level_execute_command(): starting 7554 1726853149.29407: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853149.29924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853149.29928: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853149.29931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853149.29933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853149.29993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853149.29996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853149.30003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853149.30073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853149.32195: stdout chunk (state=3): >>>/root <<< 7554 1726853149.32285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853149.32319: stderr chunk (state=3): >>><<< 7554 1726853149.32322: stdout chunk (state=3): >>><<< 7554 1726853149.32342: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853149.32359: _low_level_execute_command(): starting 7554 1726853149.32362: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394 `" && echo ansible-tmp-1726853149.3234267-7713-165480709039394="` echo /root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394 `" ) && sleep 0' 7554 1726853149.32812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853149.32815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853149.32817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 
1726853149.32827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853149.32829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853149.32880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853149.32883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853149.32887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853149.32946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853149.34890: stdout chunk (state=3): >>>ansible-tmp-1726853149.3234267-7713-165480709039394=/root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394 <<< 7554 1726853149.35004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853149.35026: stderr chunk (state=3): >>><<< 7554 1726853149.35029: stdout chunk (state=3): >>><<< 7554 1726853149.35043: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853149.3234267-7713-165480709039394=/root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853149.35068: variable 'ansible_module_compression' from source: unknown 7554 1726853149.35115: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7554 1726853149.35161: variable 'ansible_facts' from source: unknown 7554 1726853149.35295: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394/AnsiballZ_setup.py 7554 1726853149.35396: Sending initial data 7554 1726853149.35399: Sent initial data (152 bytes) 7554 1726853149.35855: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853149.35858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853149.35861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853149.35863: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853149.35868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853149.35917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853149.35922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853149.35924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853149.35992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853149.38278: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853149.38286: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853149.38338: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853149.38401: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp34edlodg /root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394/AnsiballZ_setup.py <<< 7554 1726853149.38408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394/AnsiballZ_setup.py" <<< 7554 1726853149.38469: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp34edlodg" to remote "/root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394/AnsiballZ_setup.py" <<< 7554 1726853149.38473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394/AnsiballZ_setup.py" <<< 7554 1726853149.39619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853149.39666: stderr chunk (state=3): >>><<< 7554 1726853149.39669: stdout chunk (state=3): >>><<< 7554 1726853149.39687: done transferring module to remote 7554 1726853149.39696: _low_level_execute_command(): starting 7554 1726853149.39701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394/ /root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394/AnsiballZ_setup.py && sleep 0' 7554 1726853149.40150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853149.40154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853149.40156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853149.40158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853149.40163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853149.40166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853149.40213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853149.40216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853149.40285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853149.42920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853149.42941: stderr chunk (state=3): >>><<< 7554 1726853149.42944: stdout chunk (state=3): >>><<< 7554 1726853149.42962: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853149.42965: _low_level_execute_command(): starting 7554 1726853149.42970: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394/AnsiballZ_setup.py && sleep 0' 7554 1726853149.43402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853149.43405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853149.43408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853149.43410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853149.43412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 
1726853149.43469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853149.43474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853149.43537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853150.20639: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "25", "second": "49", "epoch": "1726853149", "epoch_int": "1726853149", "date": "2024-09-20", "time": "13:25:49", "iso8601_micro": "2024-09-20T17:25:49.811036Z", "iso8601": "2024-09-20T17:25:49Z", "iso8601_basic": "20240920T132549811036", "iso8601_basic_short": "20240920T132549", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US<<< 7554 1726853150.20652: stdout chunk (state=3): >>>.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.1435546875, "5m": 0.189453125, "15m": 0.09423828125}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentat<<< 7554 1726853150.20655: stdout chunk (state=3): >>>ion_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3014, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 517, "free": 3014}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 294, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261815091200, "block_size": 4096, "block_total": 65519099, "block_available": 63919700, "block_used": 1599399, "inode_total": 131070960, "inode_available": 131029195, "inode_used": 
41765, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7554 1726853150.23431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853150.23453: stderr chunk (state=3): >>><<< 7554 1726853150.23457: stdout chunk (state=3): >>><<< 7554 1726853150.23493: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "25", "second": "49", "epoch": "1726853149", 
"epoch_int": "1726853149", "date": "2024-09-20", "time": "13:25:49", "iso8601_micro": "2024-09-20T17:25:49.811036Z", "iso8601": "2024-09-20T17:25:49Z", "iso8601_basic": "20240920T132549811036", "iso8601_basic_short": "20240920T132549", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.1435546875, "5m": 0.189453125, "15m": 0.09423828125}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": 
true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], 
"ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3014, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 517, "free": 3014}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 294, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261815091200, "block_size": 4096, "block_total": 65519099, "block_available": 63919700, "block_used": 1599399, "inode_total": 131070960, "inode_available": 131029195, "inode_used": 41765, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853150.23690: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853150.23705: _low_level_execute_command(): starting 7554 1726853150.23709: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853149.3234267-7713-165480709039394/ > /dev/null 2>&1 && sleep 0' 7554 1726853150.24155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853150.24159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853150.24161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853150.24163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853150.24165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853150.24212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853150.24215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853150.24221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853150.24286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853150.26844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853150.26870: stderr chunk (state=3): >>><<< 7554 1726853150.26875: stdout chunk (state=3): >>><<< 7554 1726853150.26888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853150.26896: handler run complete 7554 1726853150.26975: variable 'ansible_facts' from source: unknown 7554 1726853150.27042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.27216: variable 'ansible_facts' from source: unknown 7554 1726853150.27283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.27357: attempt loop complete, returning result 7554 1726853150.27360: _execute() done 7554 1726853150.27363: dumping result to json 7554 1726853150.27385: done dumping result, returning 7554 1726853150.27392: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-bdc3-98b6-0000000001b1] 7554 1726853150.27396: sending task result for task 02083763-bbaf-bdc3-98b6-0000000001b1 ok: [managed_node3] 7554 1726853150.27872: no more pending results, returning what we have 7554 1726853150.27874: results queue empty 7554 1726853150.27875: checking for any_errors_fatal 7554 1726853150.27876: done checking for any_errors_fatal 7554 1726853150.27876: checking for max_fail_percentage 7554 1726853150.27877: done checking for max_fail_percentage 7554 1726853150.27878: checking to see if all hosts have failed and the running result is not ok 7554 1726853150.27878: done checking to see if all hosts have 
failed 7554 1726853150.27879: getting the remaining hosts for this loop 7554 1726853150.27879: done getting the remaining hosts for this loop 7554 1726853150.27881: getting the next task for host managed_node3 7554 1726853150.27885: done getting next task for host managed_node3 7554 1726853150.27886: ^ task is: TASK: meta (flush_handlers) 7554 1726853150.27888: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853150.27896: getting variables 7554 1726853150.27897: in VariableManager get_vars() 7554 1726853150.27921: Calling all_inventory to load vars for managed_node3 7554 1726853150.27923: Calling groups_inventory to load vars for managed_node3 7554 1726853150.27924: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853150.27933: Calling all_plugins_play to load vars for managed_node3 7554 1726853150.27934: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853150.27937: Calling groups_plugins_play to load vars for managed_node3 7554 1726853150.28040: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000001b1 7554 1726853150.28043: WORKER PROCESS EXITING 7554 1726853150.28053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.28167: done with get_vars() 7554 1726853150.28177: done getting variables 7554 1726853150.28228: in VariableManager get_vars() 7554 1726853150.28240: Calling all_inventory to load vars for managed_node3 7554 1726853150.28242: Calling groups_inventory to load vars for managed_node3 7554 1726853150.28243: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853150.28247: Calling all_plugins_play to load vars for managed_node3 7554 1726853150.28249: 
Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853150.28250: Calling groups_plugins_play to load vars for managed_node3 7554 1726853150.28331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.28435: done with get_vars() 7554 1726853150.28445: done queuing things up, now waiting for results queue to drain 7554 1726853150.28447: results queue empty 7554 1726853150.28447: checking for any_errors_fatal 7554 1726853150.28449: done checking for any_errors_fatal 7554 1726853150.28453: checking for max_fail_percentage 7554 1726853150.28454: done checking for max_fail_percentage 7554 1726853150.28455: checking to see if all hosts have failed and the running result is not ok 7554 1726853150.28455: done checking to see if all hosts have failed 7554 1726853150.28455: getting the remaining hosts for this loop 7554 1726853150.28456: done getting the remaining hosts for this loop 7554 1726853150.28458: getting the next task for host managed_node3 7554 1726853150.28460: done getting next task for host managed_node3 7554 1726853150.28462: ^ task is: TASK: Include the task 'show_interfaces.yml' 7554 1726853150.28463: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853150.28465: getting variables 7554 1726853150.28465: in VariableManager get_vars() 7554 1726853150.28477: Calling all_inventory to load vars for managed_node3 7554 1726853150.28479: Calling groups_inventory to load vars for managed_node3 7554 1726853150.28480: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853150.28483: Calling all_plugins_play to load vars for managed_node3 7554 1726853150.28484: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853150.28486: Calling groups_plugins_play to load vars for managed_node3 7554 1726853150.28582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.28689: done with get_vars() 7554 1726853150.28695: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:9
Friday 20 September 2024  13:25:50 -0400 (0:00:01.007)       0:00:04.254 ******
7554 1726853150.28740: entering _queue_task() for managed_node3/include_tasks 7554 1726853150.28948: worker is 1 (out of 1 available) 7554 1726853150.28962: exiting _queue_task() for managed_node3/include_tasks 7554 1726853150.28974: done queuing things up, now waiting for results queue to drain 7554 1726853150.28976: waiting for pending results... 
7554 1726853150.29130: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7554 1726853150.29189: in run() - task 02083763-bbaf-bdc3-98b6-00000000000b 7554 1726853150.29205: variable 'ansible_search_path' from source: unknown 7554 1726853150.29233: calling self._execute() 7554 1726853150.29294: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.29297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853150.29311: variable 'omit' from source: magic vars 7554 1726853150.29583: variable 'ansible_distribution_major_version' from source: facts 7554 1726853150.29594: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853150.29600: _execute() done 7554 1726853150.29603: dumping result to json 7554 1726853150.29605: done dumping result, returning 7554 1726853150.29611: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-bdc3-98b6-00000000000b] 7554 1726853150.29617: sending task result for task 02083763-bbaf-bdc3-98b6-00000000000b 7554 1726853150.29703: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000000b 7554 1726853150.29706: WORKER PROCESS EXITING 7554 1726853150.29766: no more pending results, returning what we have 7554 1726853150.29770: in VariableManager get_vars() 7554 1726853150.29818: Calling all_inventory to load vars for managed_node3 7554 1726853150.29820: Calling groups_inventory to load vars for managed_node3 7554 1726853150.29823: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853150.29831: Calling all_plugins_play to load vars for managed_node3 7554 1726853150.29833: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853150.29835: Calling groups_plugins_play to load vars for managed_node3 7554 1726853150.29944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 7554 1726853150.30057: done with get_vars() 7554 1726853150.30063: variable 'ansible_search_path' from source: unknown 7554 1726853150.30075: we have included files to process 7554 1726853150.30076: generating all_blocks data 7554 1726853150.30077: done generating all_blocks data 7554 1726853150.30078: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853150.30079: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853150.30081: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853150.30181: in VariableManager get_vars() 7554 1726853150.30200: done with get_vars() 7554 1726853150.30273: done processing included file 7554 1726853150.30275: iterating over new_blocks loaded from include file 7554 1726853150.30276: in VariableManager get_vars() 7554 1726853150.30289: done with get_vars() 7554 1726853150.30290: filtering new block on tags 7554 1726853150.30303: done filtering new block on tags 7554 1726853150.30304: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7554 1726853150.30307: extending task lists for all hosts with included blocks 7554 1726853150.32556: done extending task lists 7554 1726853150.32558: done processing included files 7554 1726853150.32558: results queue empty 7554 1726853150.32559: checking for any_errors_fatal 7554 1726853150.32560: done checking for any_errors_fatal 7554 1726853150.32560: checking for max_fail_percentage 7554 1726853150.32561: done checking for max_fail_percentage 7554 1726853150.32561: checking to see if all hosts have failed and the running result is not ok 7554 1726853150.32562: 
done checking to see if all hosts have failed 7554 1726853150.32562: getting the remaining hosts for this loop 7554 1726853150.32563: done getting the remaining hosts for this loop 7554 1726853150.32565: getting the next task for host managed_node3 7554 1726853150.32567: done getting next task for host managed_node3 7554 1726853150.32569: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7554 1726853150.32572: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853150.32574: getting variables 7554 1726853150.32575: in VariableManager get_vars() 7554 1726853150.32591: Calling all_inventory to load vars for managed_node3 7554 1726853150.32593: Calling groups_inventory to load vars for managed_node3 7554 1726853150.32594: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853150.32599: Calling all_plugins_play to load vars for managed_node3 7554 1726853150.32600: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853150.32602: Calling groups_plugins_play to load vars for managed_node3 7554 1726853150.32840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.32948: done with get_vars() 7554 1726853150.32955: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024  13:25:50 -0400 (0:00:00.042)       0:00:04.297 ******
7554 1726853150.33006: entering _queue_task() for managed_node3/include_tasks 7554 1726853150.33227: worker is 1 (out of 1 available) 7554 1726853150.33239: exiting _queue_task() for managed_node3/include_tasks 7554 1726853150.33251: done queuing things up, now waiting for results queue to drain 7554 1726853150.33253: waiting for pending results... 7554 1726853150.33411: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7554 1726853150.33491: in run() - task 02083763-bbaf-bdc3-98b6-0000000001ca 7554 1726853150.33499: variable 'ansible_search_path' from source: unknown 7554 1726853150.33503: variable 'ansible_search_path' from source: unknown 7554 1726853150.33530: calling self._execute() 7554 1726853150.33597: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.33600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853150.33609: variable 'omit' from source: magic vars 7554 1726853150.33982: variable 'ansible_distribution_major_version' from source: facts 7554 1726853150.33986: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853150.33992: _execute() done 7554 1726853150.33996: dumping result to json 7554 1726853150.33999: done dumping result, returning 7554 1726853150.34004: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-bdc3-98b6-0000000001ca] 7554 1726853150.34015: sending task result for task 02083763-bbaf-bdc3-98b6-0000000001ca 7554 1726853150.34139: no more pending results, returning what we have 7554 1726853150.34144: in VariableManager get_vars() 7554 1726853150.34292: Calling all_inventory to load vars for managed_node3 7554 1726853150.34295: Calling 
groups_inventory to load vars for managed_node3 7554 1726853150.34298: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853150.34311: Calling all_plugins_play to load vars for managed_node3 7554 1726853150.34313: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853150.34317: Calling groups_plugins_play to load vars for managed_node3 7554 1726853150.34474: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000001ca 7554 1726853150.34478: WORKER PROCESS EXITING 7554 1726853150.34500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.34700: done with get_vars() 7554 1726853150.34708: variable 'ansible_search_path' from source: unknown 7554 1726853150.34710: variable 'ansible_search_path' from source: unknown 7554 1726853150.34745: we have included files to process 7554 1726853150.34749: generating all_blocks data 7554 1726853150.34750: done generating all_blocks data 7554 1726853150.34751: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853150.34753: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853150.34755: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853150.35092: done processing included file 7554 1726853150.35094: iterating over new_blocks loaded from include file 7554 1726853150.35095: in VariableManager get_vars() 7554 1726853150.35109: done with get_vars() 7554 1726853150.35110: filtering new block on tags 7554 1726853150.35120: done filtering new block on tags 7554 1726853150.35122: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7554 1726853150.35125: extending task lists for all hosts with included blocks 7554 1726853150.35189: done extending task lists 7554 1726853150.35190: done processing included files 7554 1726853150.35191: results queue empty 7554 1726853150.35191: checking for any_errors_fatal 7554 1726853150.35193: done checking for any_errors_fatal 7554 1726853150.35194: checking for max_fail_percentage 7554 1726853150.35195: done checking for max_fail_percentage 7554 1726853150.35195: checking to see if all hosts have failed and the running result is not ok 7554 1726853150.35196: done checking to see if all hosts have failed 7554 1726853150.35196: getting the remaining hosts for this loop 7554 1726853150.35197: done getting the remaining hosts for this loop 7554 1726853150.35198: getting the next task for host managed_node3 7554 1726853150.35201: done getting next task for host managed_node3 7554 1726853150.35202: ^ task is: TASK: Gather current interface info 7554 1726853150.35204: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853150.35206: getting variables 7554 1726853150.35206: in VariableManager get_vars() 7554 1726853150.35217: Calling all_inventory to load vars for managed_node3 7554 1726853150.35218: Calling groups_inventory to load vars for managed_node3 7554 1726853150.35224: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853150.35229: Calling all_plugins_play to load vars for managed_node3 7554 1726853150.35231: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853150.35233: Calling groups_plugins_play to load vars for managed_node3 7554 1726853150.35313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.35421: done with get_vars() 7554 1726853150.35427: done getting variables 7554 1726853150.35454: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Friday 20 September 2024  13:25:50 -0400 (0:00:00.024)       0:00:04.322 ******
7554 1726853150.35478: entering _queue_task() for managed_node3/command 7554 1726853150.35672: worker is 1 (out of 1 available) 7554 1726853150.35684: exiting _queue_task() for managed_node3/command 7554 1726853150.35694: done queuing things up, now waiting for results queue to drain 7554 1726853150.35696: waiting for pending results... 
7554 1726853150.35847: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7554 1726853150.35905: in run() - task 02083763-bbaf-bdc3-98b6-000000000389 7554 1726853150.35917: variable 'ansible_search_path' from source: unknown 7554 1726853150.35922: variable 'ansible_search_path' from source: unknown 7554 1726853150.35952: calling self._execute() 7554 1726853150.36012: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.36017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853150.36025: variable 'omit' from source: magic vars 7554 1726853150.36323: variable 'ansible_distribution_major_version' from source: facts 7554 1726853150.36334: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853150.36339: variable 'omit' from source: magic vars 7554 1726853150.36375: variable 'omit' from source: magic vars 7554 1726853150.36400: variable 'omit' from source: magic vars 7554 1726853150.36429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853150.36456: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853150.36475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853150.36491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853150.36500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853150.36523: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853150.36526: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.36529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853150.36599: Set connection 
var ansible_shell_executable to /bin/sh 7554 1726853150.36606: Set connection var ansible_pipelining to False 7554 1726853150.36609: Set connection var ansible_shell_type to sh 7554 1726853150.36611: Set connection var ansible_connection to ssh 7554 1726853150.36619: Set connection var ansible_timeout to 10 7554 1726853150.36624: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853150.36640: variable 'ansible_shell_executable' from source: unknown 7554 1726853150.36643: variable 'ansible_connection' from source: unknown 7554 1726853150.36649: variable 'ansible_module_compression' from source: unknown 7554 1726853150.36651: variable 'ansible_shell_type' from source: unknown 7554 1726853150.36654: variable 'ansible_shell_executable' from source: unknown 7554 1726853150.36656: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.36658: variable 'ansible_pipelining' from source: unknown 7554 1726853150.36660: variable 'ansible_timeout' from source: unknown 7554 1726853150.36662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853150.36758: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853150.36766: variable 'omit' from source: magic vars 7554 1726853150.36773: starting attempt loop 7554 1726853150.36775: running the handler 7554 1726853150.36789: _low_level_execute_command(): starting 7554 1726853150.36796: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853150.37697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853150.37806: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853150.37811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853150.37814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853150.37912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853150.39625: stdout chunk (state=3): >>>/root <<< 7554 1726853150.39752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853150.39778: stdout chunk (state=3): >>><<< 7554 1726853150.39798: stderr chunk (state=3): >>><<< 7554 1726853150.39918: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853150.39921: _low_level_execute_command(): starting 7554 1726853150.39924: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772 `" && echo ansible-tmp-1726853150.3982046-7739-204786944762772="` echo /root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772 `" ) && sleep 0' 7554 1726853150.40493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853150.40517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853150.40530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853150.40626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853150.40665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853150.40688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853150.40778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853150.42754: stdout chunk (state=3): >>>ansible-tmp-1726853150.3982046-7739-204786944762772=/root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772 <<< 7554 1726853150.42845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853150.42882: stderr chunk (state=3): >>><<< 7554 1726853150.42884: stdout chunk (state=3): >>><<< 7554 1726853150.42978: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853150.3982046-7739-204786944762772=/root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853150.42982: variable 'ansible_module_compression' from source: unknown 7554 1726853150.42984: ANSIBALLZ: Using generic lock for ansible.legacy.command 7554 1726853150.42986: ANSIBALLZ: Acquiring lock 7554 1726853150.42989: ANSIBALLZ: Lock acquired: 140257826526304 7554 1726853150.42990: ANSIBALLZ: Creating module 7554 1726853150.52004: ANSIBALLZ: Writing module into payload 7554 1726853150.52080: ANSIBALLZ: Writing module 7554 1726853150.52107: ANSIBALLZ: Renaming module 7554 1726853150.52111: ANSIBALLZ: Done creating module 7554 1726853150.52153: variable 'ansible_facts' from source: unknown 7554 1726853150.52187: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772/AnsiballZ_command.py 7554 1726853150.52576: Sending initial data 7554 1726853150.52580: Sent initial data (154 bytes) 7554 1726853150.52888: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853150.52932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853150.52963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853150.52977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853150.53061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853150.54736: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853150.54748: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853150.54803: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853150.54868: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmppofljts6 /root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772/AnsiballZ_command.py <<< 7554 1726853150.54874: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772/AnsiballZ_command.py" <<< 7554 1726853150.54942: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmppofljts6" to remote "/root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772/AnsiballZ_command.py" <<< 7554 1726853150.54945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772/AnsiballZ_command.py" <<< 7554 1726853150.55577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853150.55605: stderr chunk (state=3): >>><<< 7554 1726853150.55614: stdout chunk (state=3): >>><<< 7554 1726853150.55727: done transferring module to remote 7554 1726853150.55730: _low_level_execute_command(): starting 7554 1726853150.55733: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772/ /root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772/AnsiballZ_command.py && sleep 0' 7554 1726853150.56344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853150.56391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853150.56426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853150.56430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853150.56487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853150.59043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853150.59068: stderr chunk (state=3): >>><<< 7554 1726853150.59074: stdout chunk (state=3): >>><<< 7554 1726853150.59089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853150.59093: _low_level_execute_command(): starting 7554 1726853150.59097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772/AnsiballZ_command.py && sleep 0' 7554 1726853150.59526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853150.59529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853150.59532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853150.59535: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853150.59537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853150.59582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853150.59586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 7554 1726853150.59662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853150.82613: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:25:50.817359", "end": "2024-09-20 13:25:50.822714", "delta": "0:00:00.005355", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853150.84914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853150.84919: stdout chunk (state=3): >>><<< 7554 1726853150.84921: stderr chunk (state=3): >>><<< 7554 1726853150.84977: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:25:50.817359", "end": "2024-09-20 13:25:50.822714", "delta": "0:00:00.005355", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853150.85076: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853150.85080: _low_level_execute_command(): starting 7554 1726853150.85082: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853150.3982046-7739-204786944762772/ > /dev/null 2>&1 && sleep 0' 7554 1726853150.85741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853150.85776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7554 1726853150.85890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853150.85939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853150.86012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853150.88520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853150.88525: stdout chunk (state=3): >>><<< 7554 1726853150.88528: stderr chunk (state=3): >>><<< 7554 1726853150.88589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853150.88651: handler run complete 7554 1726853150.88667: Evaluated conditional (False): False 7554 1726853150.88688: attempt loop complete, returning result 7554 1726853150.88765: _execute() done 7554 1726853150.88769: dumping result to json 7554 1726853150.88772: done dumping result, returning 7554 1726853150.88774: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-bdc3-98b6-000000000389] 7554 1726853150.88776: sending task result for task 02083763-bbaf-bdc3-98b6-000000000389 7554 1726853150.88855: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000389 7554 1726853150.88858: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.005355", "end": "2024-09-20 13:25:50.822714", "rc": 0, "start": "2024-09-20 13:25:50.817359" } STDOUT: eth0 lo 7554 1726853150.88942: no more pending results, returning what we have 7554 1726853150.88948: results queue empty 7554 1726853150.88949: checking for any_errors_fatal 7554 1726853150.88951: done checking for any_errors_fatal 7554 1726853150.88952: checking for max_fail_percentage 7554 1726853150.88953: done checking for max_fail_percentage 7554 1726853150.88954: checking to see if all hosts have failed and the running result is not ok 7554 1726853150.88955: done checking to see if all 
hosts have failed 7554 1726853150.88956: getting the remaining hosts for this loop 7554 1726853150.88957: done getting the remaining hosts for this loop 7554 1726853150.88961: getting the next task for host managed_node3 7554 1726853150.88968: done getting next task for host managed_node3 7554 1726853150.88970: ^ task is: TASK: Set current_interfaces 7554 1726853150.89092: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853150.89097: getting variables 7554 1726853150.89099: in VariableManager get_vars() 7554 1726853150.89154: Calling all_inventory to load vars for managed_node3 7554 1726853150.89157: Calling groups_inventory to load vars for managed_node3 7554 1726853150.89160: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853150.89283: Calling all_plugins_play to load vars for managed_node3 7554 1726853150.89289: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853150.89293: Calling groups_plugins_play to load vars for managed_node3 7554 1726853150.89722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.89949: done with get_vars() 7554 1726853150.89960: done getting variables 7554 1726853150.90029: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:25:50 -0400 (0:00:00.545) 0:00:04.868 ****** 7554 1726853150.90063: entering _queue_task() for managed_node3/set_fact 7554 1726853150.90595: worker is 1 (out of 1 available) 7554 1726853150.90601: exiting _queue_task() for managed_node3/set_fact 7554 1726853150.90610: done queuing things up, now waiting for results queue to drain 7554 1726853150.90612: waiting for pending results... 
7554 1726853150.91488: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7554 1726853150.91493: in run() - task 02083763-bbaf-bdc3-98b6-00000000038a 7554 1726853150.91496: variable 'ansible_search_path' from source: unknown 7554 1726853150.91499: variable 'ansible_search_path' from source: unknown 7554 1726853150.91644: calling self._execute() 7554 1726853150.91790: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.91801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853150.91875: variable 'omit' from source: magic vars 7554 1726853150.92778: variable 'ansible_distribution_major_version' from source: facts 7554 1726853150.92802: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853150.92817: variable 'omit' from source: magic vars 7554 1726853150.92903: variable 'omit' from source: magic vars 7554 1726853150.93025: variable '_current_interfaces' from source: set_fact 7554 1726853150.93106: variable 'omit' from source: magic vars 7554 1726853150.93205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853150.93223: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853150.93261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853150.93287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853150.93311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853150.93358: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853150.93403: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.93437: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7554 1726853150.93585: Set connection var ansible_shell_executable to /bin/sh 7554 1726853150.93599: Set connection var ansible_pipelining to False 7554 1726853150.93606: Set connection var ansible_shell_type to sh 7554 1726853150.93613: Set connection var ansible_connection to ssh 7554 1726853150.93655: Set connection var ansible_timeout to 10 7554 1726853150.93660: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853150.93689: variable 'ansible_shell_executable' from source: unknown 7554 1726853150.93738: variable 'ansible_connection' from source: unknown 7554 1726853150.93770: variable 'ansible_module_compression' from source: unknown 7554 1726853150.93775: variable 'ansible_shell_type' from source: unknown 7554 1726853150.93778: variable 'ansible_shell_executable' from source: unknown 7554 1726853150.93786: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.93789: variable 'ansible_pipelining' from source: unknown 7554 1726853150.93791: variable 'ansible_timeout' from source: unknown 7554 1726853150.93793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853150.93957: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853150.93961: variable 'omit' from source: magic vars 7554 1726853150.93973: starting attempt loop 7554 1726853150.94005: running the handler 7554 1726853150.94008: handler run complete 7554 1726853150.94021: attempt loop complete, returning result 7554 1726853150.94028: _execute() done 7554 1726853150.94064: dumping result to json 7554 1726853150.94068: done dumping result, returning 7554 1726853150.94073: done running TaskExecutor() for managed_node3/TASK: 
Set current_interfaces [02083763-bbaf-bdc3-98b6-00000000038a] 7554 1726853150.94075: sending task result for task 02083763-bbaf-bdc3-98b6-00000000038a ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7554 1726853150.94342: no more pending results, returning what we have 7554 1726853150.94347: results queue empty 7554 1726853150.94349: checking for any_errors_fatal 7554 1726853150.94357: done checking for any_errors_fatal 7554 1726853150.94358: checking for max_fail_percentage 7554 1726853150.94360: done checking for max_fail_percentage 7554 1726853150.94361: checking to see if all hosts have failed and the running result is not ok 7554 1726853150.94362: done checking to see if all hosts have failed 7554 1726853150.94363: getting the remaining hosts for this loop 7554 1726853150.94364: done getting the remaining hosts for this loop 7554 1726853150.94369: getting the next task for host managed_node3 7554 1726853150.94379: done getting next task for host managed_node3 7554 1726853150.94501: ^ task is: TASK: Show current_interfaces 7554 1726853150.94505: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853150.94510: getting variables 7554 1726853150.94511: in VariableManager get_vars() 7554 1726853150.94561: Calling all_inventory to load vars for managed_node3 7554 1726853150.94564: Calling groups_inventory to load vars for managed_node3 7554 1726853150.94566: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853150.94614: Calling all_plugins_play to load vars for managed_node3 7554 1726853150.94618: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853150.94624: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000038a 7554 1726853150.94627: WORKER PROCESS EXITING 7554 1726853150.94630: Calling groups_plugins_play to load vars for managed_node3 7554 1726853150.94921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.95068: done with get_vars() 7554 1726853150.95077: done getting variables 7554 1726853150.95142: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:25:50 -0400 (0:00:00.051) 0:00:04.919 ****** 7554 1726853150.95168: entering _queue_task() for managed_node3/debug 7554 1726853150.95169: Creating lock for debug 7554 1726853150.95370: worker is 1 (out of 1 available) 7554 1726853150.95386: exiting _queue_task() for managed_node3/debug 7554 1726853150.95398: done queuing things up, now waiting for results queue to drain 7554 1726853150.95401: waiting for pending results... 
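From the task banner and path recorded above (show_interfaces.yml:5), the task being queued is a plain `debug` of the fact set just before it; a minimal reconstruction (the exact wording of the file is not shown in this log, only the message it prints):

```yaml
# Hypothetical reconstruction of the task at show_interfaces.yml:5;
# the actual file in the fedora.linux_system_roles tests may differ.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```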
7554 1726853150.95646: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7554 1726853150.95719: in run() - task 02083763-bbaf-bdc3-98b6-0000000001cb 7554 1726853150.95736: variable 'ansible_search_path' from source: unknown 7554 1726853150.95741: variable 'ansible_search_path' from source: unknown 7554 1726853150.95804: calling self._execute() 7554 1726853150.96077: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.96080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853150.96083: variable 'omit' from source: magic vars 7554 1726853150.96341: variable 'ansible_distribution_major_version' from source: facts 7554 1726853150.96359: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853150.96376: variable 'omit' from source: magic vars 7554 1726853150.96425: variable 'omit' from source: magic vars 7554 1726853150.96511: variable 'current_interfaces' from source: set_fact 7554 1726853150.96545: variable 'omit' from source: magic vars 7554 1726853150.96587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853150.96649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853150.96652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853150.96657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853150.96667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853150.96725: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853150.96733: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.96781: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7554 1726853150.97103: Set connection var ansible_shell_executable to /bin/sh 7554 1726853150.97106: Set connection var ansible_pipelining to False 7554 1726853150.97109: Set connection var ansible_shell_type to sh 7554 1726853150.97111: Set connection var ansible_connection to ssh 7554 1726853150.97112: Set connection var ansible_timeout to 10 7554 1726853150.97114: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853150.97116: variable 'ansible_shell_executable' from source: unknown 7554 1726853150.97118: variable 'ansible_connection' from source: unknown 7554 1726853150.97120: variable 'ansible_module_compression' from source: unknown 7554 1726853150.97122: variable 'ansible_shell_type' from source: unknown 7554 1726853150.97124: variable 'ansible_shell_executable' from source: unknown 7554 1726853150.97126: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.97128: variable 'ansible_pipelining' from source: unknown 7554 1726853150.97129: variable 'ansible_timeout' from source: unknown 7554 1726853150.97131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853150.97247: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853150.97266: variable 'omit' from source: magic vars 7554 1726853150.97278: starting attempt loop 7554 1726853150.97285: running the handler 7554 1726853150.97334: handler run complete 7554 1726853150.97352: attempt loop complete, returning result 7554 1726853150.97359: _execute() done 7554 1726853150.97366: dumping result to json 7554 1726853150.97376: done dumping result, returning 7554 1726853150.97390: done running TaskExecutor() for managed_node3/TASK: Show 
current_interfaces [02083763-bbaf-bdc3-98b6-0000000001cb] 7554 1726853150.97400: sending task result for task 02083763-bbaf-bdc3-98b6-0000000001cb 7554 1726853150.97577: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000001cb 7554 1726853150.97580: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7554 1726853150.97707: no more pending results, returning what we have 7554 1726853150.97710: results queue empty 7554 1726853150.97711: checking for any_errors_fatal 7554 1726853150.97714: done checking for any_errors_fatal 7554 1726853150.97715: checking for max_fail_percentage 7554 1726853150.97717: done checking for max_fail_percentage 7554 1726853150.97717: checking to see if all hosts have failed and the running result is not ok 7554 1726853150.97718: done checking to see if all hosts have failed 7554 1726853150.97719: getting the remaining hosts for this loop 7554 1726853150.97720: done getting the remaining hosts for this loop 7554 1726853150.97725: getting the next task for host managed_node3 7554 1726853150.97731: done getting next task for host managed_node3 7554 1726853150.97734: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7554 1726853150.97737: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853150.97741: getting variables 7554 1726853150.97743: in VariableManager get_vars() 7554 1726853150.97889: Calling all_inventory to load vars for managed_node3 7554 1726853150.97892: Calling groups_inventory to load vars for managed_node3 7554 1726853150.97895: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853150.97905: Calling all_plugins_play to load vars for managed_node3 7554 1726853150.97908: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853150.97911: Calling groups_plugins_play to load vars for managed_node3 7554 1726853150.98070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853150.98422: done with get_vars() 7554 1726853150.98435: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:11 Friday 20 September 2024 13:25:50 -0400 (0:00:00.033) 0:00:04.952 ****** 7554 1726853150.98531: entering _queue_task() for managed_node3/include_tasks 7554 1726853150.98848: worker is 1 (out of 1 available) 7554 1726853150.98863: exiting _queue_task() for managed_node3/include_tasks 7554 1726853150.99079: done queuing things up, now waiting for results queue to drain 7554 1726853150.99081: waiting for pending results... 
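The include task queued above (tests_auto_gateway.yml:11) amounts to an `include_tasks` pointing at the file the log goes on to load; a sketch, with the relative path assumed from the absolute path in the log:

```yaml
# Sketch only: the relative path is inferred from the file the log loads,
# /tmp/collections-Qi7/.../tests/network/playbooks/tasks/manage_test_interface.yml
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
```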
7554 1726853150.99151: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7554 1726853150.99277: in run() - task 02083763-bbaf-bdc3-98b6-00000000000c 7554 1726853150.99282: variable 'ansible_search_path' from source: unknown 7554 1726853150.99303: calling self._execute() 7554 1726853150.99388: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853150.99399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853150.99418: variable 'omit' from source: magic vars 7554 1726853150.99854: variable 'ansible_distribution_major_version' from source: facts 7554 1726853150.99857: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853150.99859: _execute() done 7554 1726853150.99862: dumping result to json 7554 1726853150.99864: done dumping result, returning 7554 1726853150.99866: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [02083763-bbaf-bdc3-98b6-00000000000c] 7554 1726853150.99868: sending task result for task 02083763-bbaf-bdc3-98b6-00000000000c 7554 1726853150.99931: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000000c 7554 1726853150.99934: WORKER PROCESS EXITING 7554 1726853150.99987: no more pending results, returning what we have 7554 1726853150.99992: in VariableManager get_vars() 7554 1726853151.00044: Calling all_inventory to load vars for managed_node3 7554 1726853151.00047: Calling groups_inventory to load vars for managed_node3 7554 1726853151.00049: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.00062: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.00065: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.00068: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.00391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7554 1726853151.00630: done with get_vars() 7554 1726853151.00638: variable 'ansible_search_path' from source: unknown 7554 1726853151.00650: we have included files to process 7554 1726853151.00651: generating all_blocks data 7554 1726853151.00652: done generating all_blocks data 7554 1726853151.00657: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853151.00659: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853151.00661: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853151.01329: in VariableManager get_vars() 7554 1726853151.01354: done with get_vars() 7554 1726853151.01578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 7554 1726853151.02370: done processing included file 7554 1726853151.02373: iterating over new_blocks loaded from include file 7554 1726853151.02375: in VariableManager get_vars() 7554 1726853151.02390: done with get_vars() 7554 1726853151.02391: filtering new block on tags 7554 1726853151.02409: done filtering new block on tags 7554 1726853151.02411: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7554 1726853151.02414: extending task lists for all hosts with included blocks 7554 1726853151.04838: done extending task lists 7554 1726853151.04840: done processing included files 7554 1726853151.04841: results queue empty 7554 1726853151.04842: checking for any_errors_fatal 7554 1726853151.04847: done checking for any_errors_fatal 7554 1726853151.04848: checking for max_fail_percentage 7554 1726853151.04849: done checking for 
max_fail_percentage 7554 1726853151.04850: checking to see if all hosts have failed and the running result is not ok 7554 1726853151.04851: done checking to see if all hosts have failed 7554 1726853151.04852: getting the remaining hosts for this loop 7554 1726853151.04853: done getting the remaining hosts for this loop 7554 1726853151.04855: getting the next task for host managed_node3 7554 1726853151.04859: done getting next task for host managed_node3 7554 1726853151.04862: ^ task is: TASK: Ensure state in ["present", "absent"] 7554 1726853151.04864: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853151.04867: getting variables 7554 1726853151.04868: in VariableManager get_vars() 7554 1726853151.04899: Calling all_inventory to load vars for managed_node3 7554 1726853151.04902: Calling groups_inventory to load vars for managed_node3 7554 1726853151.04904: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.04911: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.04913: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.04916: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.05084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853151.05326: done with get_vars() 7554 1726853151.05337: done getting variables 7554 1726853151.05412: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:25:51 -0400 (0:00:00.069) 0:00:05.022 ****** 7554 1726853151.05453: entering _queue_task() for managed_node3/fail 7554 1726853151.05455: Creating lock for fail 7554 1726853151.05750: worker is 1 (out of 1 available) 7554 1726853151.05763: exiting _queue_task() for managed_node3/fail 7554 1726853151.05780: done queuing things up, now waiting for results queue to drain 7554 1726853151.05782: waiting for pending results... 
7554 1726853151.05960: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7554 1726853151.06033: in run() - task 02083763-bbaf-bdc3-98b6-0000000003a5 7554 1726853151.06047: variable 'ansible_search_path' from source: unknown 7554 1726853151.06051: variable 'ansible_search_path' from source: unknown 7554 1726853151.06079: calling self._execute() 7554 1726853151.06143: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.06149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.06156: variable 'omit' from source: magic vars 7554 1726853151.06439: variable 'ansible_distribution_major_version' from source: facts 7554 1726853151.06450: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853151.06540: variable 'state' from source: include params 7554 1726853151.06549: Evaluated conditional (state not in ["present", "absent"]): False 7554 1726853151.06553: when evaluation is False, skipping this task 7554 1726853151.06556: _execute() done 7554 1726853151.06558: dumping result to json 7554 1726853151.06561: done dumping result, returning 7554 1726853151.06563: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-bdc3-98b6-0000000003a5] 7554 1726853151.06573: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003a5 7554 1726853151.06656: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003a5 7554 1726853151.06658: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7554 1726853151.06714: no more pending results, returning what we have 7554 1726853151.06717: results queue empty 7554 1726853151.06718: checking for any_errors_fatal 7554 1726853151.06719: done checking for any_errors_fatal 7554 1726853151.06720: checking for 
max_fail_percentage 7554 1726853151.06722: done checking for max_fail_percentage 7554 1726853151.06722: checking to see if all hosts have failed and the running result is not ok 7554 1726853151.06723: done checking to see if all hosts have failed 7554 1726853151.06724: getting the remaining hosts for this loop 7554 1726853151.06725: done getting the remaining hosts for this loop 7554 1726853151.06729: getting the next task for host managed_node3 7554 1726853151.06734: done getting next task for host managed_node3 7554 1726853151.06736: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7554 1726853151.06739: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853151.06743: getting variables 7554 1726853151.06747: in VariableManager get_vars() 7554 1726853151.06791: Calling all_inventory to load vars for managed_node3 7554 1726853151.06794: Calling groups_inventory to load vars for managed_node3 7554 1726853151.06796: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.06805: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.06806: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.06809: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.06930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853151.07046: done with get_vars() 7554 1726853151.07054: done getting variables 7554 1726853151.07097: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:25:51 -0400 (0:00:00.016) 0:00:05.038 ****** 7554 1726853151.07117: entering _queue_task() for managed_node3/fail 7554 1726853151.07319: worker is 1 (out of 1 available) 7554 1726853151.07334: exiting _queue_task() for managed_node3/fail 7554 1726853151.07346: done queuing things up, now waiting for results queue to drain 7554 1726853151.07347: waiting for pending results... 
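The two guard tasks at manage_test_interface.yml:3 and :8 can be reconstructed from the `false_condition` strings the log records for their skips; the `when` expressions below are verbatim from the log, while the `fail` messages are placeholders, not taken from the file:

```yaml
# Conditions reconstructed from the logged false_condition values;
# the msg text is a placeholder, not the real file's wording.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state"            # placeholder message
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported interface type"   # placeholder message
  when: type not in ["dummy", "tap", "veth"]
```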
7554 1726853151.07559: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7554 1726853151.07676: in run() - task 02083763-bbaf-bdc3-98b6-0000000003a6 7554 1726853151.07680: variable 'ansible_search_path' from source: unknown 7554 1726853151.07683: variable 'ansible_search_path' from source: unknown 7554 1726853151.07703: calling self._execute() 7554 1726853151.07793: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.07805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.07818: variable 'omit' from source: magic vars 7554 1726853151.08182: variable 'ansible_distribution_major_version' from source: facts 7554 1726853151.08204: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853151.08400: variable 'type' from source: play vars 7554 1726853151.08404: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7554 1726853151.08406: when evaluation is False, skipping this task 7554 1726853151.08408: _execute() done 7554 1726853151.08416: dumping result to json 7554 1726853151.08419: done dumping result, returning 7554 1726853151.08421: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-bdc3-98b6-0000000003a6] 7554 1726853151.08422: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003a6 skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7554 1726853151.08536: no more pending results, returning what we have 7554 1726853151.08544: results queue empty 7554 1726853151.08546: checking for any_errors_fatal 7554 1726853151.08553: done checking for any_errors_fatal 7554 1726853151.08554: checking for max_fail_percentage 7554 1726853151.08556: done checking for max_fail_percentage 7554 1726853151.08557: checking to see if all hosts have failed and the 
running result is not ok 7554 1726853151.08558: done checking to see if all hosts have failed 7554 1726853151.08559: getting the remaining hosts for this loop 7554 1726853151.08560: done getting the remaining hosts for this loop 7554 1726853151.08564: getting the next task for host managed_node3 7554 1726853151.08572: done getting next task for host managed_node3 7554 1726853151.08576: ^ task is: TASK: Include the task 'show_interfaces.yml' 7554 1726853151.08580: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853151.08585: getting variables 7554 1726853151.08587: in VariableManager get_vars() 7554 1726853151.08642: Calling all_inventory to load vars for managed_node3 7554 1726853151.08644: Calling groups_inventory to load vars for managed_node3 7554 1726853151.08647: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.08657: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003a6 7554 1726853151.08659: WORKER PROCESS EXITING 7554 1726853151.08669: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.08674: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.08677: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.08914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853151.09120: done with get_vars() 7554 1726853151.09129: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:25:51 -0400 (0:00:00.021) 0:00:05.059 ****** 7554 1726853151.09225: entering _queue_task() for managed_node3/include_tasks 7554 1726853151.09506: worker is 1 (out of 1 available) 7554 1726853151.09519: exiting _queue_task() for managed_node3/include_tasks 7554 1726853151.09530: done queuing things up, now waiting for results queue to drain 7554 1726853151.09532: waiting for pending results... 
7554 1726853151.09684: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7554 1726853151.09755: in run() - task 02083763-bbaf-bdc3-98b6-0000000003a7 7554 1726853151.09765: variable 'ansible_search_path' from source: unknown 7554 1726853151.09769: variable 'ansible_search_path' from source: unknown 7554 1726853151.09797: calling self._execute() 7554 1726853151.09862: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.09867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.09944: variable 'omit' from source: magic vars 7554 1726853151.10141: variable 'ansible_distribution_major_version' from source: facts 7554 1726853151.10150: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853151.10162: _execute() done 7554 1726853151.10165: dumping result to json 7554 1726853151.10168: done dumping result, returning 7554 1726853151.10173: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-bdc3-98b6-0000000003a7] 7554 1726853151.10179: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003a7 7554 1726853151.10263: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003a7 7554 1726853151.10266: WORKER PROCESS EXITING 7554 1726853151.10327: no more pending results, returning what we have 7554 1726853151.10331: in VariableManager get_vars() 7554 1726853151.10387: Calling all_inventory to load vars for managed_node3 7554 1726853151.10389: Calling groups_inventory to load vars for managed_node3 7554 1726853151.10391: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.10397: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.10399: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.10401: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.10505: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853151.10617: done with get_vars() 7554 1726853151.10624: variable 'ansible_search_path' from source: unknown 7554 1726853151.10625: variable 'ansible_search_path' from source: unknown 7554 1726853151.10647: we have included files to process 7554 1726853151.10648: generating all_blocks data 7554 1726853151.10649: done generating all_blocks data 7554 1726853151.10652: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853151.10653: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853151.10654: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853151.10722: in VariableManager get_vars() 7554 1726853151.10740: done with get_vars() 7554 1726853151.10815: done processing included file 7554 1726853151.10817: iterating over new_blocks loaded from include file 7554 1726853151.10818: in VariableManager get_vars() 7554 1726853151.10833: done with get_vars() 7554 1726853151.10834: filtering new block on tags 7554 1726853151.10844: done filtering new block on tags 7554 1726853151.10846: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7554 1726853151.10850: extending task lists for all hosts with included blocks 7554 1726853151.11101: done extending task lists 7554 1726853151.11102: done processing included files 7554 1726853151.11102: results queue empty 7554 1726853151.11103: checking for any_errors_fatal 7554 1726853151.11105: done checking for any_errors_fatal 7554 1726853151.11106: checking for max_fail_percentage 7554 
1726853151.11106: done checking for max_fail_percentage 7554 1726853151.11107: checking to see if all hosts have failed and the running result is not ok 7554 1726853151.11107: done checking to see if all hosts have failed 7554 1726853151.11108: getting the remaining hosts for this loop 7554 1726853151.11108: done getting the remaining hosts for this loop 7554 1726853151.11110: getting the next task for host managed_node3 7554 1726853151.11112: done getting next task for host managed_node3 7554 1726853151.11114: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7554 1726853151.11116: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853151.11118: getting variables 7554 1726853151.11118: in VariableManager get_vars() 7554 1726853151.11128: Calling all_inventory to load vars for managed_node3 7554 1726853151.11129: Calling groups_inventory to load vars for managed_node3 7554 1726853151.11131: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.11134: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.11135: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.11137: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.11217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853151.11348: done with get_vars() 7554 1726853151.11356: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:25:51 -0400 (0:00:00.021) 0:00:05.081 ****** 7554 1726853151.11423: entering _queue_task() for managed_node3/include_tasks 7554 1726853151.11660: worker is 1 (out of 1 available) 7554 1726853151.11675: exiting _queue_task() for managed_node3/include_tasks 7554 1726853151.11686: done queuing things up, now waiting for results queue to drain 7554 1726853151.11688: waiting for pending results... 
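The task queued above (show_interfaces.yml:3) nests the include chain one level deeper; judging from the file the log loads next, it is roughly:

```yaml
# Assumed shape of the task at show_interfaces.yml:3; the real file
# resolves the path relative to its own directory.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```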
7554 1726853151.11994: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7554 1726853151.12037: in run() - task 02083763-bbaf-bdc3-98b6-00000000057e 7554 1726853151.12059: variable 'ansible_search_path' from source: unknown 7554 1726853151.12068: variable 'ansible_search_path' from source: unknown 7554 1726853151.12117: calling self._execute() 7554 1726853151.12206: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.12219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.12235: variable 'omit' from source: magic vars 7554 1726853151.12540: variable 'ansible_distribution_major_version' from source: facts 7554 1726853151.12552: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853151.12565: _execute() done 7554 1726853151.12570: dumping result to json 7554 1726853151.12575: done dumping result, returning 7554 1726853151.12578: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-bdc3-98b6-00000000057e] 7554 1726853151.12580: sending task result for task 02083763-bbaf-bdc3-98b6-00000000057e 7554 1726853151.12664: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000057e 7554 1726853151.12705: no more pending results, returning what we have 7554 1726853151.12710: in VariableManager get_vars() 7554 1726853151.12758: Calling all_inventory to load vars for managed_node3 7554 1726853151.12761: Calling groups_inventory to load vars for managed_node3 7554 1726853151.12763: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.12774: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.12776: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.12786: WORKER PROCESS EXITING 7554 1726853151.12792: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.12931: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853151.13050: done with get_vars() 7554 1726853151.13056: variable 'ansible_search_path' from source: unknown 7554 1726853151.13057: variable 'ansible_search_path' from source: unknown 7554 1726853151.13095: we have included files to process 7554 1726853151.13096: generating all_blocks data 7554 1726853151.13097: done generating all_blocks data 7554 1726853151.13098: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853151.13099: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853151.13101: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853151.13278: done processing included file 7554 1726853151.13280: iterating over new_blocks loaded from include file 7554 1726853151.13281: in VariableManager get_vars() 7554 1726853151.13296: done with get_vars() 7554 1726853151.13297: filtering new block on tags 7554 1726853151.13308: done filtering new block on tags 7554 1726853151.13309: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7554 1726853151.13312: extending task lists for all hosts with included blocks 7554 1726853151.13402: done extending task lists 7554 1726853151.13403: done processing included files 7554 1726853151.13404: results queue empty 7554 1726853151.13404: checking for any_errors_fatal 7554 1726853151.13406: done checking for any_errors_fatal 7554 1726853151.13406: checking for max_fail_percentage 7554 1726853151.13407: done checking for max_fail_percentage 7554 
1726853151.13407: checking to see if all hosts have failed and the running result is not ok 7554 1726853151.13408: done checking to see if all hosts have failed 7554 1726853151.13409: getting the remaining hosts for this loop 7554 1726853151.13409: done getting the remaining hosts for this loop 7554 1726853151.13411: getting the next task for host managed_node3 7554 1726853151.13414: done getting next task for host managed_node3 7554 1726853151.13415: ^ task is: TASK: Gather current interface info 7554 1726853151.13417: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853151.13419: getting variables 7554 1726853151.13419: in VariableManager get_vars() 7554 1726853151.13431: Calling all_inventory to load vars for managed_node3 7554 1726853151.13432: Calling groups_inventory to load vars for managed_node3 7554 1726853151.13434: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.13439: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.13441: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.13442: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.13524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853151.13634: done with get_vars() 7554 1726853151.13640: done getting variables 7554 1726853151.13673: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:25:51 -0400 (0:00:00.022) 0:00:05.104 ****** 7554 1726853151.13693: entering _queue_task() for managed_node3/command 7554 1726853151.13897: worker is 1 (out of 1 available) 7554 1726853151.13910: exiting _queue_task() for managed_node3/command 7554 1726853151.13921: done queuing things up, now waiting for results queue to drain 7554 1726853151.13922: waiting for pending results... 
7554 1726853151.14077: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7554 1726853151.14151: in run() - task 02083763-bbaf-bdc3-98b6-0000000005b5 7554 1726853151.14166: variable 'ansible_search_path' from source: unknown 7554 1726853151.14170: variable 'ansible_search_path' from source: unknown 7554 1726853151.14192: calling self._execute() 7554 1726853151.14256: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.14262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.14270: variable 'omit' from source: magic vars 7554 1726853151.14781: variable 'ansible_distribution_major_version' from source: facts 7554 1726853151.14791: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853151.14797: variable 'omit' from source: magic vars 7554 1726853151.14835: variable 'omit' from source: magic vars 7554 1726853151.14859: variable 'omit' from source: magic vars 7554 1726853151.14890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853151.14918: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853151.14935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853151.14951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853151.14959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853151.14983: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853151.14987: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.14989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.15059: Set connection 
var ansible_shell_executable to /bin/sh 7554 1726853151.15066: Set connection var ansible_pipelining to False 7554 1726853151.15068: Set connection var ansible_shell_type to sh 7554 1726853151.15073: Set connection var ansible_connection to ssh 7554 1726853151.15081: Set connection var ansible_timeout to 10 7554 1726853151.15085: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853151.15103: variable 'ansible_shell_executable' from source: unknown 7554 1726853151.15106: variable 'ansible_connection' from source: unknown 7554 1726853151.15110: variable 'ansible_module_compression' from source: unknown 7554 1726853151.15112: variable 'ansible_shell_type' from source: unknown 7554 1726853151.15115: variable 'ansible_shell_executable' from source: unknown 7554 1726853151.15117: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.15119: variable 'ansible_pipelining' from source: unknown 7554 1726853151.15122: variable 'ansible_timeout' from source: unknown 7554 1726853151.15125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.15221: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853151.15229: variable 'omit' from source: magic vars 7554 1726853151.15235: starting attempt loop 7554 1726853151.15237: running the handler 7554 1726853151.15255: _low_level_execute_command(): starting 7554 1726853151.15259: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853151.15781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853151.15785: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853151.15789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853151.15792: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853151.15794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.15849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853151.15852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853151.15855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.15938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853151.18276: stdout chunk (state=3): >>>/root <<< 7554 1726853151.18422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853151.18453: stderr chunk (state=3): >>><<< 7554 1726853151.18456: stdout chunk (state=3): >>><<< 7554 1726853151.18479: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7554 1726853151.18491: _low_level_execute_command(): starting 7554 1726853151.18499: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521 `" && echo ansible-tmp-1726853151.1847901-7776-121541784345521="` echo /root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521 `" ) && sleep 0' 7554 1726853151.18952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853151.18955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.18965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853151.18968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.19023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853151.19026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853151.19028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.19088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853151.21849: stdout chunk (state=3): >>>ansible-tmp-1726853151.1847901-7776-121541784345521=/root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521 <<< 7554 1726853151.22076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853151.22079: stdout chunk (state=3): >>><<< 7554 1726853151.22081: stderr chunk (state=3): >>><<< 7554 1726853151.22178: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853151.1847901-7776-121541784345521=/root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7554 1726853151.22182: variable 'ansible_module_compression' from source: unknown 7554 1726853151.22209: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853151.22252: variable 'ansible_facts' from source: unknown 7554 1726853151.22354: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521/AnsiballZ_command.py 7554 1726853151.22540: Sending initial data 7554 1726853151.22543: Sent initial data (154 bytes) 7554 1726853151.23150: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853151.23167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853151.23278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853151.23299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.23320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853151.23337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853151.23358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.23458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853151.25927: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853151.25932: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853151.26008: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp9uqjcvew /root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521/AnsiballZ_command.py <<< 7554 1726853151.26013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521/AnsiballZ_command.py" <<< 7554 1726853151.26146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp9uqjcvew" to remote "/root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521/AnsiballZ_command.py" <<< 7554 1726853151.27513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853151.27590: stderr chunk (state=3): >>><<< 7554 1726853151.27649: stdout chunk (state=3): >>><<< 7554 1726853151.27659: done transferring module to remote 7554 1726853151.27677: _low_level_execute_command(): starting 7554 1726853151.27686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521/ /root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521/AnsiballZ_command.py && sleep 0' 7554 1726853151.28334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853151.28349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853151.28364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853151.28387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853151.28424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853151.28443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853151.28485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.28541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853151.28559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853151.28580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.28678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853151.31241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853151.31288: stderr chunk (state=3): >>><<< 7554 1726853151.31291: stdout chunk (state=3): >>><<< 7554 1726853151.31305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7554 1726853151.31307: _low_level_execute_command(): starting 7554 1726853151.31313: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521/AnsiballZ_command.py && sleep 0' 7554 1726853151.31755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853151.31759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.31761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853151.31763: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853151.31765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.31816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853151.31821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.31888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853151.48275: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:25:51.477501", "end": "2024-09-20 13:25:51.480846", "delta": "0:00:00.003345", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853151.49904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853151.49937: stderr chunk (state=3): >>><<< 7554 1726853151.49941: stdout chunk (state=3): >>><<< 7554 1726853151.49956: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:25:51.477501", "end": "2024-09-20 13:25:51.480846", "delta": "0:00:00.003345", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
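The module result JSON echoed in the chunk above is what the controller hands back to the play; a minimal sketch of how its `stdout` field becomes the interface list that the upcoming "Set current_interfaces" task works from (the `raw` string below is abbreviated to the fields inspected here, and the derivation via splitting on newlines mirrors Ansible's `stdout_lines` convention, not this log's exact internals):

```python
import json

# Abbreviated copy of the module result JSON visible in the log above
# (only the fields this sketch inspects; the real result carries more keys).
raw = ('{"changed": true, "stdout": "eth0\\nlo", "stderr": "", '
       '"rc": 0, "cmd": ["ls", "-1"], "delta": "0:00:00.003345"}')

result = json.loads(raw)

# Ansible exposes stdout split on newlines as stdout_lines; the later
# "Set current_interfaces" step consumes that list of interface names.
current_interfaces = result["stdout"].split("\n")
print(current_interfaces)  # ['eth0', 'lo']
```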
7554 1726853151.49987: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853151.49994: _low_level_execute_command(): starting 7554 1726853151.49999: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853151.1847901-7776-121541784345521/ > /dev/null 2>&1 && sleep 0' 7554 1726853151.50443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853151.50447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.50477: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.50524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853151.50528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853151.50534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.50598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853151.52501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853151.52531: stderr chunk (state=3): >>><<< 7554 1726853151.52534: stdout chunk (state=3): >>><<< 7554 1726853151.52546: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: 
Received exit status from master 0 7554 1726853151.52555: handler run complete 7554 1726853151.52573: Evaluated conditional (False): False 7554 1726853151.52582: attempt loop complete, returning result 7554 1726853151.52585: _execute() done 7554 1726853151.52587: dumping result to json 7554 1726853151.52592: done dumping result, returning 7554 1726853151.52599: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-bdc3-98b6-0000000005b5] 7554 1726853151.52604: sending task result for task 02083763-bbaf-bdc3-98b6-0000000005b5 7554 1726853151.52703: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000005b5 7554 1726853151.52706: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003345", "end": "2024-09-20 13:25:51.480846", "rc": 0, "start": "2024-09-20 13:25:51.477501" } STDOUT: eth0 lo 7554 1726853151.53064: no more pending results, returning what we have 7554 1726853151.53067: results queue empty 7554 1726853151.53067: checking for any_errors_fatal 7554 1726853151.53068: done checking for any_errors_fatal 7554 1726853151.53068: checking for max_fail_percentage 7554 1726853151.53070: done checking for max_fail_percentage 7554 1726853151.53070: checking to see if all hosts have failed and the running result is not ok 7554 1726853151.53072: done checking to see if all hosts have failed 7554 1726853151.53073: getting the remaining hosts for this loop 7554 1726853151.53074: done getting the remaining hosts for this loop 7554 1726853151.53076: getting the next task for host managed_node3 7554 1726853151.53080: done getting next task for host managed_node3 7554 1726853151.53082: ^ task is: TASK: Set current_interfaces 7554 1726853151.53086: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853151.53088: getting variables 7554 1726853151.53089: in VariableManager get_vars() 7554 1726853151.53113: Calling all_inventory to load vars for managed_node3 7554 1726853151.53114: Calling groups_inventory to load vars for managed_node3 7554 1726853151.53115: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.53123: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.53124: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.53126: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.53222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853151.53338: done with get_vars() 7554 1726853151.53348: done getting variables 7554 1726853151.53394: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:25:51 -0400 (0:00:00.397) 0:00:05.501 ****** 7554 1726853151.53415: entering _queue_task() for managed_node3/set_fact 7554 1726853151.53622: worker is 1 (out of 1 available) 7554 1726853151.53634: exiting _queue_task() for managed_node3/set_fact 7554 1726853151.53649: done queuing things up, now waiting for results queue to drain 7554 1726853151.53651: waiting for pending results... 7554 1726853151.53806: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7554 1726853151.53882: in run() - task 02083763-bbaf-bdc3-98b6-0000000005b6 7554 1726853151.53894: variable 'ansible_search_path' from source: unknown 7554 1726853151.53898: variable 'ansible_search_path' from source: unknown 7554 1726853151.53926: calling self._execute() 7554 1726853151.54002: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.54005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.54013: variable 'omit' from source: magic vars 7554 1726853151.54296: variable 'ansible_distribution_major_version' from source: facts 7554 1726853151.54308: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853151.54311: variable 'omit' from source: magic vars 7554 1726853151.54351: variable 'omit' from source: magic vars 7554 1726853151.54431: variable '_current_interfaces' from source: set_fact 7554 1726853151.54475: variable 'omit' from source: magic vars 7554 1726853151.54505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853151.54534: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853151.54552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853151.54566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853151.54577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853151.54600: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853151.54604: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.54607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.54679: Set connection var ansible_shell_executable to /bin/sh 7554 1726853151.54686: Set connection var ansible_pipelining to False 7554 1726853151.54689: Set connection var ansible_shell_type to sh 7554 1726853151.54692: Set connection var ansible_connection to ssh 7554 1726853151.54699: Set connection var ansible_timeout to 10 7554 1726853151.54704: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853151.54721: variable 'ansible_shell_executable' from source: unknown 7554 1726853151.54724: variable 'ansible_connection' from source: unknown 7554 1726853151.54727: variable 'ansible_module_compression' from source: unknown 7554 1726853151.54730: variable 'ansible_shell_type' from source: unknown 7554 1726853151.54732: variable 'ansible_shell_executable' from source: unknown 7554 1726853151.54735: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.54737: variable 'ansible_pipelining' from source: unknown 7554 1726853151.54739: variable 'ansible_timeout' from source: unknown 7554 1726853151.54744: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853151.54843: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853151.54852: variable 'omit' from source: magic vars 7554 1726853151.54857: starting attempt loop 7554 1726853151.54860: running the handler 7554 1726853151.54874: handler run complete 7554 1726853151.54882: attempt loop complete, returning result 7554 1726853151.54885: _execute() done 7554 1726853151.54888: dumping result to json 7554 1726853151.54890: done dumping result, returning 7554 1726853151.54896: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [02083763-bbaf-bdc3-98b6-0000000005b6] 7554 1726853151.54901: sending task result for task 02083763-bbaf-bdc3-98b6-0000000005b6 7554 1726853151.54978: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000005b6 7554 1726853151.54981: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7554 1726853151.55037: no more pending results, returning what we have 7554 1726853151.55040: results queue empty 7554 1726853151.55041: checking for any_errors_fatal 7554 1726853151.55051: done checking for any_errors_fatal 7554 1726853151.55051: checking for max_fail_percentage 7554 1726853151.55053: done checking for max_fail_percentage 7554 1726853151.55053: checking to see if all hosts have failed and the running result is not ok 7554 1726853151.55054: done checking to see if all hosts have failed 7554 1726853151.55055: getting the remaining hosts for this loop 7554 1726853151.55057: done getting the remaining hosts for this loop 7554 1726853151.55061: getting the next task for host managed_node3 7554 1726853151.55068: done getting next task for host managed_node3 7554 
1726853151.55072: ^ task is: TASK: Show current_interfaces 7554 1726853151.55076: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853151.55079: getting variables 7554 1726853151.55081: in VariableManager get_vars() 7554 1726853151.55122: Calling all_inventory to load vars for managed_node3 7554 1726853151.55124: Calling groups_inventory to load vars for managed_node3 7554 1726853151.55126: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.55135: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.55137: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.55140: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.55262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853151.55408: done with get_vars() 7554 1726853151.55415: done getting variables 7554 1726853151.55453: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:25:51 -0400 (0:00:00.020) 0:00:05.522 ****** 7554 1726853151.55476: entering _queue_task() for managed_node3/debug 7554 1726853151.55664: worker is 1 (out of 1 available) 7554 1726853151.55679: exiting _queue_task() for managed_node3/debug 7554 1726853151.55692: done queuing things up, now waiting for results queue to drain 7554 1726853151.55694: waiting for pending results... 7554 1726853151.55853: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7554 1726853151.55916: in run() - task 02083763-bbaf-bdc3-98b6-00000000057f 7554 1726853151.55927: variable 'ansible_search_path' from source: unknown 7554 1726853151.55930: variable 'ansible_search_path' from source: unknown 7554 1726853151.55963: calling self._execute() 7554 1726853151.56032: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.56036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.56050: variable 'omit' from source: magic vars 7554 1726853151.56329: variable 'ansible_distribution_major_version' from source: facts 7554 1726853151.56340: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853151.56348: variable 'omit' from source: magic vars 7554 1726853151.56382: variable 'omit' from source: magic vars 7554 1726853151.56448: variable 'current_interfaces' from source: set_fact 7554 1726853151.56469: variable 'omit' from source: magic vars 7554 1726853151.56501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853151.56535: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853151.56547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853151.56560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853151.56569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853151.56595: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853151.56599: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.56601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.56669: Set connection var ansible_shell_executable to /bin/sh 7554 1726853151.56677: Set connection var ansible_pipelining to False 7554 1726853151.56681: Set connection var ansible_shell_type to sh 7554 1726853151.56683: Set connection var ansible_connection to ssh 7554 1726853151.56693: Set connection var ansible_timeout to 10 7554 1726853151.56697: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853151.56715: variable 'ansible_shell_executable' from source: unknown 7554 1726853151.56719: variable 'ansible_connection' from source: unknown 7554 1726853151.56721: variable 'ansible_module_compression' from source: unknown 7554 1726853151.56723: variable 'ansible_shell_type' from source: unknown 7554 1726853151.56726: variable 'ansible_shell_executable' from source: unknown 7554 1726853151.56728: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.56730: variable 'ansible_pipelining' from source: unknown 7554 1726853151.56732: variable 'ansible_timeout' from source: unknown 7554 1726853151.56737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 
1726853151.56835: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853151.56843: variable 'omit' from source: magic vars 7554 1726853151.56849: starting attempt loop 7554 1726853151.56852: running the handler 7554 1726853151.56888: handler run complete 7554 1726853151.56897: attempt loop complete, returning result 7554 1726853151.56905: _execute() done 7554 1726853151.56908: dumping result to json 7554 1726853151.56911: done dumping result, returning 7554 1726853151.56913: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [02083763-bbaf-bdc3-98b6-00000000057f] 7554 1726853151.56917: sending task result for task 02083763-bbaf-bdc3-98b6-00000000057f 7554 1726853151.56996: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000057f 7554 1726853151.56999: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7554 1726853151.57073: no more pending results, returning what we have 7554 1726853151.57076: results queue empty 7554 1726853151.57076: checking for any_errors_fatal 7554 1726853151.57080: done checking for any_errors_fatal 7554 1726853151.57081: checking for max_fail_percentage 7554 1726853151.57082: done checking for max_fail_percentage 7554 1726853151.57083: checking to see if all hosts have failed and the running result is not ok 7554 1726853151.57083: done checking to see if all hosts have failed 7554 1726853151.57084: getting the remaining hosts for this loop 7554 1726853151.57085: done getting the remaining hosts for this loop 7554 1726853151.57090: getting the next task for host managed_node3 7554 1726853151.57096: done getting next task for host managed_node3 7554 1726853151.57098: ^ task is: TASK: Install iproute 7554 
1726853151.57101: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853151.57104: getting variables 7554 1726853151.57105: in VariableManager get_vars() 7554 1726853151.57144: Calling all_inventory to load vars for managed_node3 7554 1726853151.57148: Calling groups_inventory to load vars for managed_node3 7554 1726853151.57151: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853151.57159: Calling all_plugins_play to load vars for managed_node3 7554 1726853151.57161: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853151.57164: Calling groups_plugins_play to load vars for managed_node3 7554 1726853151.57277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853151.57396: done with get_vars() 7554 1726853151.57404: done getting variables 7554 1726853151.57441: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:25:51 -0400 (0:00:00.019) 
0:00:05.542 ****** 7554 1726853151.57463: entering _queue_task() for managed_node3/package 7554 1726853151.57652: worker is 1 (out of 1 available) 7554 1726853151.57666: exiting _queue_task() for managed_node3/package 7554 1726853151.57681: done queuing things up, now waiting for results queue to drain 7554 1726853151.57683: waiting for pending results... 7554 1726853151.57829: running TaskExecutor() for managed_node3/TASK: Install iproute 7554 1726853151.57888: in run() - task 02083763-bbaf-bdc3-98b6-0000000003a8 7554 1726853151.57899: variable 'ansible_search_path' from source: unknown 7554 1726853151.57904: variable 'ansible_search_path' from source: unknown 7554 1726853151.57934: calling self._execute() 7554 1726853151.58005: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.58009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.58018: variable 'omit' from source: magic vars 7554 1726853151.58293: variable 'ansible_distribution_major_version' from source: facts 7554 1726853151.58304: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853151.58310: variable 'omit' from source: magic vars 7554 1726853151.58333: variable 'omit' from source: magic vars 7554 1726853151.58494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853151.59933: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853151.60137: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853151.60142: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853151.60145: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853151.60147: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853151.60187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853151.60229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853151.60257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853151.60360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853151.60364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853151.60407: variable '__network_is_ostree' from source: set_fact 7554 1726853151.60410: variable 'omit' from source: magic vars 7554 1726853151.60442: variable 'omit' from source: magic vars 7554 1726853151.60474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853151.60507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853151.60573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853151.60577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853151.60579: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853151.60582: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853151.60584: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.60586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.60687: Set connection var ansible_shell_executable to /bin/sh 7554 1726853151.60694: Set connection var ansible_pipelining to False 7554 1726853151.60697: Set connection var ansible_shell_type to sh 7554 1726853151.60700: Set connection var ansible_connection to ssh 7554 1726853151.60709: Set connection var ansible_timeout to 10 7554 1726853151.60714: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853151.60737: variable 'ansible_shell_executable' from source: unknown 7554 1726853151.60740: variable 'ansible_connection' from source: unknown 7554 1726853151.60743: variable 'ansible_module_compression' from source: unknown 7554 1726853151.60746: variable 'ansible_shell_type' from source: unknown 7554 1726853151.60748: variable 'ansible_shell_executable' from source: unknown 7554 1726853151.60776: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853151.60779: variable 'ansible_pipelining' from source: unknown 7554 1726853151.60835: variable 'ansible_timeout' from source: unknown 7554 1726853151.60838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853151.60854: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853151.60864: variable 'omit' from source: magic vars 7554 1726853151.60869: starting attempt loop 7554 
1726853151.60874: running the handler 7554 1726853151.60895: variable 'ansible_facts' from source: unknown 7554 1726853151.60898: variable 'ansible_facts' from source: unknown 7554 1726853151.60942: _low_level_execute_command(): starting 7554 1726853151.60945: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853151.61549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853151.61568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853151.61584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.61636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853151.61643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853151.61645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.61717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853151.63420: stdout chunk (state=3): >>>/root <<< 7554 1726853151.63535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
7554 1726853151.63580: stderr chunk (state=3): >>><<< 7554 1726853151.63584: stdout chunk (state=3): >>><<< 7554 1726853151.63597: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7554 1726853151.63609: _low_level_execute_command(): starting 7554 1726853151.63615: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000 `" && echo ansible-tmp-1726853151.6359742-7800-166194835501000="` echo /root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000 `" ) && sleep 0' 7554 1726853151.64174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853151.64178: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853151.64180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.64182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853151.64184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853151.64186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.64264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.64340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853151.66339: stdout chunk (state=3): >>>ansible-tmp-1726853151.6359742-7800-166194835501000=/root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000 <<< 7554 1726853151.66447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853151.66475: stderr chunk (state=3): >>><<< 7554 1726853151.66479: stdout chunk (state=3): >>><<< 7554 1726853151.66494: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853151.6359742-7800-166194835501000=/root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7554 1726853151.66525: variable 'ansible_module_compression' from source: unknown 7554 1726853151.66568: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 7554 1726853151.66574: ANSIBALLZ: Acquiring lock 7554 1726853151.66576: ANSIBALLZ: Lock acquired: 140257826526304 7554 1726853151.66578: ANSIBALLZ: Creating module 7554 1726853151.78168: ANSIBALLZ: Writing module into payload 7554 1726853151.78311: ANSIBALLZ: Writing module 7554 1726853151.78330: ANSIBALLZ: Renaming module 7554 1726853151.78341: ANSIBALLZ: Done creating module 7554 1726853151.78360: variable 'ansible_facts' from source: unknown 7554 1726853151.78424: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000/AnsiballZ_dnf.py 7554 1726853151.78536: Sending initial data 7554 1726853151.78539: Sent initial data (150 bytes) 7554 1726853151.78976: stderr 
chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853151.78995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853151.78998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.79010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.79058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853151.79072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.79144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853151.81419: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7554 1726853151.81423: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853151.81481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853151.81544: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpmu3f7cil /root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000/AnsiballZ_dnf.py <<< 7554 1726853151.81548: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000/AnsiballZ_dnf.py" <<< 7554 1726853151.81602: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpmu3f7cil" to remote "/root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000/AnsiballZ_dnf.py" <<< 7554 1726853151.81610: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000/AnsiballZ_dnf.py" <<< 7554 1726853151.82436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853151.82463: stderr chunk (state=3): >>><<< 7554 1726853151.82473: stdout chunk (state=3): >>><<< 7554 1726853151.82564: done transferring module to remote 7554 1726853151.82568: _low_level_execute_command(): starting 7554 1726853151.82570: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000/ /root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000/AnsiballZ_dnf.py && sleep 0' 7554 1726853151.83152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853151.83159: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853151.83173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853151.83188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853151.83293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853151.83297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853151.83299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853151.83321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.83413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7554 1726853151.86027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853151.86030: stderr chunk (state=3): >>><<< 7554 1726853151.86033: stdout chunk (state=3): >>><<< 7554 1726853151.86058: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7554 1726853151.86062: _low_level_execute_command(): starting 7554 1726853151.86064: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000/AnsiballZ_dnf.py && sleep 0' 7554 1726853151.86794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853151.86804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853151.86806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853151.86858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853154.80914: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7554 1726853154.85278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853154.85302: stderr chunk (state=3): >>><<< 7554 1726853154.85311: stdout chunk (state=3): >>><<< 7554 1726853154.85336: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853154.85392: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853154.85468: _low_level_execute_command(): starting 7554 1726853154.85475: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853151.6359742-7800-166194835501000/ > /dev/null 2>&1 && sleep 0' 7554 1726853154.86094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853154.86126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853154.86152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853154.86221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853154.88167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853154.88172: stdout chunk (state=3): >>><<< 7554 1726853154.88175: stderr chunk (state=3): >>><<< 7554 1726853154.88191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 7554 1726853154.88372: handler run complete 7554 1726853154.88377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853154.88613: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853154.88642: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853154.88678: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853154.88709: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853154.88762: variable '__install_status' from source: unknown 7554 1726853154.88780: Evaluated conditional (__install_status is success): True 7554 1726853154.88793: attempt loop complete, returning result 7554 1726853154.88797: _execute() done 7554 1726853154.88801: dumping result to json 7554 1726853154.88815: done dumping result, returning 7554 1726853154.88824: done running TaskExecutor() for managed_node3/TASK: Install iproute [02083763-bbaf-bdc3-98b6-0000000003a8] 7554 1726853154.88829: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003a8 ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7554 1726853154.88973: no more pending results, returning what we have 7554 1726853154.88977: results queue empty 7554 1726853154.88978: checking for any_errors_fatal 7554 1726853154.88982: done checking for any_errors_fatal 7554 1726853154.88982: checking for max_fail_percentage 7554 1726853154.88984: done checking for max_fail_percentage 7554 1726853154.88985: checking to see if all hosts have failed and the running result is not ok 7554 1726853154.88986: done checking to see if all hosts have failed 7554 1726853154.88987: getting the remaining hosts for this loop 7554 1726853154.88988: done getting the 
remaining hosts for this loop 7554 1726853154.88992: getting the next task for host managed_node3 7554 1726853154.88997: done getting next task for host managed_node3 7554 1726853154.89000: ^ task is: TASK: Create veth interface {{ interface }} 7554 1726853154.89003: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853154.89007: getting variables 7554 1726853154.89008: in VariableManager get_vars() 7554 1726853154.89055: Calling all_inventory to load vars for managed_node3 7554 1726853154.89057: Calling groups_inventory to load vars for managed_node3 7554 1726853154.89059: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853154.89069: Calling all_plugins_play to load vars for managed_node3 7554 1726853154.89079: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853154.89084: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003a8 7554 1726853154.89086: WORKER PROCESS EXITING 7554 1726853154.89089: Calling groups_plugins_play to load vars for managed_node3 7554 1726853154.89306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853154.89440: done with get_vars() 7554 1726853154.89450: done getting variables 7554 1726853154.89507: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853154.89633: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:25:54 -0400 (0:00:03.322) 0:00:08.864 ****** 7554 1726853154.89683: entering _queue_task() for managed_node3/command 7554 1726853154.89928: worker is 1 (out of 1 available) 7554 1726853154.89942: exiting _queue_task() for managed_node3/command 7554 1726853154.89959: done queuing things up, now waiting for results queue to drain 7554 1726853154.89961: waiting for pending results... 7554 1726853154.90235: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7554 1726853154.90354: in run() - task 02083763-bbaf-bdc3-98b6-0000000003a9 7554 1726853154.90384: variable 'ansible_search_path' from source: unknown 7554 1726853154.90398: variable 'ansible_search_path' from source: unknown 7554 1726853154.91079: variable 'interface' from source: play vars 7554 1726853154.91082: variable 'interface' from source: play vars 7554 1726853154.91130: variable 'interface' from source: play vars 7554 1726853154.91293: Loaded config def from plugin (lookup/items) 7554 1726853154.91315: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7554 1726853154.91341: variable 'omit' from source: magic vars 7554 1726853154.91482: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853154.91504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853154.91520: variable 'omit' from source: magic vars 7554 1726853154.91756: variable 'ansible_distribution_major_version' from source: 
facts 7554 1726853154.91767: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853154.92000: variable 'type' from source: play vars 7554 1726853154.92016: variable 'state' from source: include params 7554 1726853154.92033: variable 'interface' from source: play vars 7554 1726853154.92050: variable 'current_interfaces' from source: set_fact 7554 1726853154.92062: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7554 1726853154.92080: variable 'omit' from source: magic vars 7554 1726853154.92129: variable 'omit' from source: magic vars 7554 1726853154.92166: variable 'item' from source: unknown 7554 1726853154.92222: variable 'item' from source: unknown 7554 1726853154.92248: variable 'omit' from source: magic vars 7554 1726853154.92273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853154.92296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853154.92313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853154.92324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853154.92332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853154.92359: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853154.92362: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853154.92365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853154.92433: Set connection var ansible_shell_executable to /bin/sh 7554 1726853154.92440: Set connection var ansible_pipelining to False 7554 1726853154.92444: Set connection var ansible_shell_type to sh 7554 
1726853154.92449: Set connection var ansible_connection to ssh 7554 1726853154.92455: Set connection var ansible_timeout to 10 7554 1726853154.92458: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853154.92474: variable 'ansible_shell_executable' from source: unknown 7554 1726853154.92477: variable 'ansible_connection' from source: unknown 7554 1726853154.92480: variable 'ansible_module_compression' from source: unknown 7554 1726853154.92483: variable 'ansible_shell_type' from source: unknown 7554 1726853154.92485: variable 'ansible_shell_executable' from source: unknown 7554 1726853154.92487: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853154.92492: variable 'ansible_pipelining' from source: unknown 7554 1726853154.92494: variable 'ansible_timeout' from source: unknown 7554 1726853154.92497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853154.92593: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853154.92601: variable 'omit' from source: magic vars 7554 1726853154.92607: starting attempt loop 7554 1726853154.92610: running the handler 7554 1726853154.92623: _low_level_execute_command(): starting 7554 1726853154.92631: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853154.93176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853154.93180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853154.93261: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853154.93321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853154.95017: stdout chunk (state=3): >>>/root <<< 7554 1726853154.95196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853154.95199: stdout chunk (state=3): >>><<< 7554 1726853154.95202: stderr chunk (state=3): >>><<< 7554 1726853154.95231: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853154.95259: _low_level_execute_command(): starting 7554 1726853154.95262: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788 `" && echo ansible-tmp-1726853154.9523187-7954-90123945785788="` echo /root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788 `" ) && sleep 0' 7554 1726853154.95687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853154.95690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853154.95693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853154.95696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853154.95698: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853154.95742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853154.95746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853154.95813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853154.97763: stdout chunk (state=3): >>>ansible-tmp-1726853154.9523187-7954-90123945785788=/root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788 <<< 7554 1726853154.97878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853154.97902: stderr chunk (state=3): >>><<< 7554 1726853154.97904: stdout chunk (state=3): >>><<< 7554 1726853154.97919: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853154.9523187-7954-90123945785788=/root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853154.98079: variable 'ansible_module_compression' from source: unknown 7554 1726853154.98082: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853154.98084: variable 'ansible_facts' from source: unknown 7554 1726853154.98086: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788/AnsiballZ_command.py 7554 1726853154.98145: Sending initial data 7554 1726853154.98155: Sent initial data (153 bytes) 7554 1726853154.98536: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853154.98548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853154.98568: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853154.98573: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853154.98620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/bee039678b' <<< 7554 1726853154.98624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853154.98691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.00317: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7554 1726853155.00321: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853155.00375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853155.00432: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp7ygyj9q2 /root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788/AnsiballZ_command.py <<< 7554 1726853155.00435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788/AnsiballZ_command.py" <<< 7554 1726853155.00487: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp7ygyj9q2" to remote "/root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788/AnsiballZ_command.py" <<< 7554 1726853155.01094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.01126: stderr chunk (state=3): >>><<< 7554 1726853155.01130: stdout chunk (state=3): >>><<< 7554 1726853155.01143: done transferring module to remote 7554 1726853155.01151: _low_level_execute_command(): starting 7554 1726853155.01156: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788/ /root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788/AnsiballZ_command.py && sleep 0' 7554 1726853155.01545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853155.01548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.01554: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853155.01556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853155.01558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.01605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853155.01609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.01672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.03494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.03516: stderr chunk (state=3): >>><<< 7554 1726853155.03519: stdout chunk (state=3): >>><<< 7554 1726853155.03530: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853155.03533: _low_level_execute_command(): starting 7554 1726853155.03536: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788/AnsiballZ_command.py && sleep 0' 7554 1726853155.03926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853155.03931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.03933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853155.03935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853155.03938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.03988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: 
fd 3 setting O_NONBLOCK <<< 7554 1726853155.03991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.04064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.27770: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 13:25:55.197703", "end": "2024-09-20 13:25:55.270715", "delta": "0:00:00.073012", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853155.30102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853155.30133: stderr chunk (state=3): >>><<< 7554 1726853155.30135: stdout chunk (state=3): >>><<< 7554 1726853155.30178: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 13:25:55.197703", "end": "2024-09-20 13:25:55.270715", "delta": "0:00:00.073012", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853155.30187: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853155.30192: _low_level_execute_command(): starting 7554 1726853155.30197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853154.9523187-7954-90123945785788/ > 
/dev/null 2>&1 && sleep 0' 7554 1726853155.30645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853155.30648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.30651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853155.30653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.30708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853155.30713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853155.30715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.30799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.35711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.35736: stderr chunk (state=3): >>><<< 7554 1726853155.35739: stdout chunk (state=3): >>><<< 7554 1726853155.35755: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853155.35761: handler run complete 7554 1726853155.35782: Evaluated conditional (False): False 7554 1726853155.35790: attempt loop complete, returning result 7554 1726853155.35805: variable 'item' from source: unknown 7554 1726853155.35873: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.073012", "end": "2024-09-20 13:25:55.270715", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-20 13:25:55.197703" } 7554 1726853155.36044: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853155.36050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853155.36053: variable 'omit' from source: magic vars 7554 1726853155.36143: 
variable 'ansible_distribution_major_version' from source: facts 7554 1726853155.36149: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853155.36268: variable 'type' from source: play vars 7554 1726853155.36281: variable 'state' from source: include params 7554 1726853155.36284: variable 'interface' from source: play vars 7554 1726853155.36286: variable 'current_interfaces' from source: set_fact 7554 1726853155.36289: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7554 1726853155.36291: variable 'omit' from source: magic vars 7554 1726853155.36301: variable 'omit' from source: magic vars 7554 1726853155.36327: variable 'item' from source: unknown 7554 1726853155.36370: variable 'item' from source: unknown 7554 1726853155.36387: variable 'omit' from source: magic vars 7554 1726853155.36401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853155.36409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853155.36414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853155.36424: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853155.36428: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853155.36430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853155.36480: Set connection var ansible_shell_executable to /bin/sh 7554 1726853155.36486: Set connection var ansible_pipelining to False 7554 1726853155.36497: Set connection var ansible_shell_type to sh 7554 1726853155.36500: Set connection var ansible_connection to ssh 7554 1726853155.36502: Set connection var ansible_timeout to 10 7554 
1726853155.36504: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853155.36519: variable 'ansible_shell_executable' from source: unknown 7554 1726853155.36521: variable 'ansible_connection' from source: unknown 7554 1726853155.36524: variable 'ansible_module_compression' from source: unknown 7554 1726853155.36526: variable 'ansible_shell_type' from source: unknown 7554 1726853155.36528: variable 'ansible_shell_executable' from source: unknown 7554 1726853155.36531: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853155.36535: variable 'ansible_pipelining' from source: unknown 7554 1726853155.36538: variable 'ansible_timeout' from source: unknown 7554 1726853155.36542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853155.36607: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853155.36618: variable 'omit' from source: magic vars 7554 1726853155.36621: starting attempt loop 7554 1726853155.36624: running the handler 7554 1726853155.36630: _low_level_execute_command(): starting 7554 1726853155.36633: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853155.37093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853155.37096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853155.37098: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.37104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853155.37106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.37158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853155.37161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853155.37164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.37229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.38894: stdout chunk (state=3): >>>/root <<< 7554 1726853155.39174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.39178: stdout chunk (state=3): >>><<< 7554 1726853155.39181: stderr chunk (state=3): >>><<< 7554 1726853155.39292: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853155.39296: _low_level_execute_command(): starting 7554 1726853155.39299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137 `" && echo ansible-tmp-1726853155.3920252-7954-112531043252137="` echo /root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137 `" ) && sleep 0' 7554 1726853155.39876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853155.39893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853155.39908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853155.40030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853155.40057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.40167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.42104: stdout chunk (state=3): >>>ansible-tmp-1726853155.3920252-7954-112531043252137=/root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137 <<< 7554 1726853155.42210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.42234: stderr chunk (state=3): >>><<< 7554 1726853155.42237: stdout chunk (state=3): >>><<< 7554 1726853155.42253: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853155.3920252-7954-112531043252137=/root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853155.42274: variable 'ansible_module_compression' from source: unknown 7554 1726853155.42319: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853155.42335: variable 'ansible_facts' from source: unknown 7554 1726853155.42382: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137/AnsiballZ_command.py 7554 1726853155.42465: Sending initial data 7554 1726853155.42468: Sent initial data (154 bytes) 7554 1726853155.42893: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853155.42896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.42899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853155.42901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853155.42903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.42950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853155.42959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.43014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.44659: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7554 1726853155.44665: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853155.44716: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853155.44774: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpktg5w95v /root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137/AnsiballZ_command.py <<< 7554 1726853155.44777: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137/AnsiballZ_command.py" <<< 7554 1726853155.44827: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpktg5w95v" to remote "/root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137/AnsiballZ_command.py" <<< 7554 1726853155.44834: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137/AnsiballZ_command.py" <<< 7554 1726853155.45429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.45466: stderr chunk (state=3): >>><<< 7554 1726853155.45470: stdout chunk (state=3): >>><<< 7554 1726853155.45513: done transferring module to remote 7554 1726853155.45521: _low_level_execute_command(): starting 7554 1726853155.45526: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137/ /root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137/AnsiballZ_command.py && sleep 0' 7554 1726853155.45949: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853155.45952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853155.45954: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853155.45956: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.46005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853155.46009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.46075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.48068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.48073: stdout chunk (state=3): >>><<< 7554 1726853155.48076: stderr chunk (state=3): >>><<< 7554 1726853155.48078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853155.48081: _low_level_execute_command(): starting 7554 1726853155.48083: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137/AnsiballZ_command.py && sleep 0' 7554 1726853155.48610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853155.48688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.48722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853155.48738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853155.48759: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.48859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.65125: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 13:25:55.644938", "end": "2024-09-20 13:25:55.648883", "delta": "0:00:00.003945", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853155.67102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853155.67110: stdout chunk (state=3): >>><<< 7554 1726853155.67112: stderr chunk (state=3): >>><<< 7554 1726853155.67243: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 13:25:55.644938", "end": "2024-09-20 13:25:55.648883", "delta": "0:00:00.003945", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853155.67247: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853155.67250: _low_level_execute_command(): starting 7554 1726853155.67252: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853155.3920252-7954-112531043252137/ > /dev/null 2>&1 && sleep 0' 7554 1726853155.67994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.68014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853155.68043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.68130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.70092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.70138: stdout chunk (state=3): >>><<< 7554 1726853155.70151: stderr chunk (state=3): >>><<< 7554 1726853155.70276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853155.70279: handler run complete 7554 1726853155.70282: Evaluated conditional (False): False 7554 1726853155.70284: attempt loop complete, returning result 7554 1726853155.70337: variable 'item' from source: unknown 7554 1726853155.70572: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003945", "end": "2024-09-20 13:25:55.648883", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 13:25:55.644938" } 7554 1726853155.70798: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853155.70801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853155.70804: variable 'omit' from source: magic vars 7554 1726853155.70918: variable 'ansible_distribution_major_version' from source: facts 7554 1726853155.70929: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853155.71123: variable 'type' from source: play vars 7554 1726853155.71133: variable 'state' from source: include params 7554 1726853155.71141: variable 'interface' from source: play vars 7554 1726853155.71149: variable 'current_interfaces' from source: set_fact 7554 1726853155.71159: Evaluated conditional (type 
== 'veth' and state == 'present' and interface not in current_interfaces): True 7554 1726853155.71168: variable 'omit' from source: magic vars 7554 1726853155.71188: variable 'omit' from source: magic vars 7554 1726853155.71240: variable 'item' from source: unknown 7554 1726853155.71329: variable 'item' from source: unknown 7554 1726853155.71332: variable 'omit' from source: magic vars 7554 1726853155.71355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853155.71368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853155.71382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853155.71437: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853155.71441: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853155.71445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853155.71506: Set connection var ansible_shell_executable to /bin/sh 7554 1726853155.71519: Set connection var ansible_pipelining to False 7554 1726853155.71527: Set connection var ansible_shell_type to sh 7554 1726853155.71534: Set connection var ansible_connection to ssh 7554 1726853155.71560: Set connection var ansible_timeout to 10 7554 1726853155.71575: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853155.71655: variable 'ansible_shell_executable' from source: unknown 7554 1726853155.71659: variable 'ansible_connection' from source: unknown 7554 1726853155.71661: variable 'ansible_module_compression' from source: unknown 7554 1726853155.71665: variable 'ansible_shell_type' from source: unknown 7554 1726853155.71667: variable 'ansible_shell_executable' from source: unknown 7554 
1726853155.71669: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853155.71673: variable 'ansible_pipelining' from source: unknown 7554 1726853155.71675: variable 'ansible_timeout' from source: unknown 7554 1726853155.71677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853155.71764: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853155.71767: variable 'omit' from source: magic vars 7554 1726853155.71769: starting attempt loop 7554 1726853155.71773: running the handler 7554 1726853155.71775: _low_level_execute_command(): starting 7554 1726853155.71777: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853155.72422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853155.72425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 
1726853155.72497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853155.72519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853155.72551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.72850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.74412: stdout chunk (state=3): >>>/root <<< 7554 1726853155.74517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.74548: stderr chunk (state=3): >>><<< 7554 1726853155.74559: stdout chunk (state=3): >>><<< 7554 1726853155.74584: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853155.74598: 
_low_level_execute_command(): starting 7554 1726853155.74608: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393 `" && echo ansible-tmp-1726853155.7458842-7954-42184201637393="` echo /root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393 `" ) && sleep 0' 7554 1726853155.75200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853155.75211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853155.75223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853155.75237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853155.75284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.75342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853155.75360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853155.75381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.75465: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 7554 1726853155.77431: stdout chunk (state=3): >>>ansible-tmp-1726853155.7458842-7954-42184201637393=/root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393 <<< 7554 1726853155.77562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.77592: stdout chunk (state=3): >>><<< 7554 1726853155.77604: stderr chunk (state=3): >>><<< 7554 1726853155.77623: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853155.7458842-7954-42184201637393=/root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853155.77655: variable 'ansible_module_compression' from source: unknown 7554 1726853155.77706: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853155.78079: variable 'ansible_facts' from source: unknown 7554 1726853155.78083: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393/AnsiballZ_command.py 7554 1726853155.78138: Sending initial data 7554 1726853155.78141: Sent initial data (153 bytes) 7554 1726853155.78666: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853155.78688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853155.78780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853155.78801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.78877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.80514: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853155.80596: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853155.80688: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmprf5ltapo /root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393/AnsiballZ_command.py <<< 7554 1726853155.80697: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393/AnsiballZ_command.py" <<< 7554 1726853155.80731: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmprf5ltapo" to remote "/root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393/AnsiballZ_command.py" <<< 7554 1726853155.81544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.81558: stderr chunk (state=3): >>><<< 7554 1726853155.81567: stdout chunk (state=3): >>><<< 7554 1726853155.81639: done transferring module to remote 7554 1726853155.81653: _low_level_execute_command(): starting 7554 1726853155.81662: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393/ /root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393/AnsiballZ_command.py && sleep 0' 7554 1726853155.82311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853155.82386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.82434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853155.82450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853155.82473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.82559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853155.84393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853155.84425: stderr chunk (state=3): >>><<< 7554 1726853155.84432: stdout chunk (state=3): >>><<< 7554 1726853155.84447: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853155.84453: _low_level_execute_command(): starting 7554 1726853155.84459: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393/AnsiballZ_command.py && sleep 0' 7554 1726853155.84874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853155.84895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration <<< 7554 1726853155.84902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853155.84953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853155.84957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853155.85029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.01139: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 13:25:56.005125", "end": "2024-09-20 13:25:56.009196", "delta": "0:00:00.004071", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853156.02776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853156.02801: stderr chunk (state=3): >>><<< 7554 1726853156.02806: stdout chunk (state=3): >>><<< 7554 1726853156.02823: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 13:25:56.005125", "end": "2024-09-20 13:25:56.009196", "delta": "0:00:00.004071", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853156.02842: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853156.02849: _low_level_execute_command(): starting 7554 1726853156.02855: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853155.7458842-7954-42184201637393/ > /dev/null 2>&1 && sleep 0' 7554 1726853156.03249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853156.03290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853156.03293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.03295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.03297: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853156.03299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.03340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853156.03343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.03408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.05251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.05273: stderr chunk (state=3): >>><<< 7554 1726853156.05277: stdout chunk (state=3): >>><<< 7554 1726853156.05286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853156.05291: handler run complete 7554 1726853156.05305: Evaluated conditional (False): False 7554 1726853156.05316: attempt loop complete, returning result 7554 1726853156.05330: variable 'item' from source: unknown 7554 1726853156.05391: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.004071", "end": "2024-09-20 13:25:56.009196", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 13:25:56.005125" } 7554 1726853156.05507: dumping result to json 7554 1726853156.05510: done dumping result, returning 7554 1726853156.05512: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [02083763-bbaf-bdc3-98b6-0000000003a9] 7554 1726853156.05514: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003a9 7554 1726853156.05556: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003a9 7554 1726853156.05558: WORKER PROCESS EXITING 7554 1726853156.05619: no more pending results, returning what we have 7554 1726853156.05622: results queue empty 7554 1726853156.05623: checking for any_errors_fatal 7554 1726853156.05628: done checking for any_errors_fatal 7554 1726853156.05629: checking for max_fail_percentage 7554 1726853156.05630: done checking for max_fail_percentage 7554 1726853156.05630: checking to see if all hosts have failed and the running result is not ok 7554 1726853156.05631: done checking to see if all hosts have failed 7554 1726853156.05632: getting the remaining hosts for this loop 7554 1726853156.05633: done getting the remaining hosts for this loop 7554 1726853156.05636: getting the next task for host managed_node3 7554 1726853156.05641: done getting next task for host managed_node3 7554 1726853156.05644: ^ task is: TASK: Set up 
veth as managed by NetworkManager 7554 1726853156.05647: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853156.05650: getting variables 7554 1726853156.05651: in VariableManager get_vars() 7554 1726853156.05702: Calling all_inventory to load vars for managed_node3 7554 1726853156.05705: Calling groups_inventory to load vars for managed_node3 7554 1726853156.05707: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853156.05718: Calling all_plugins_play to load vars for managed_node3 7554 1726853156.05720: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853156.05723: Calling groups_plugins_play to load vars for managed_node3 7554 1726853156.05881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853156.06005: done with get_vars() 7554 1726853156.06013: done getting variables 7554 1726853156.06054: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 
September 2024 13:25:56 -0400 (0:00:01.163) 0:00:10.028 ****** 7554 1726853156.06075: entering _queue_task() for managed_node3/command 7554 1726853156.06266: worker is 1 (out of 1 available) 7554 1726853156.06280: exiting _queue_task() for managed_node3/command 7554 1726853156.06293: done queuing things up, now waiting for results queue to drain 7554 1726853156.06294: waiting for pending results... 7554 1726853156.06452: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7554 1726853156.06508: in run() - task 02083763-bbaf-bdc3-98b6-0000000003aa 7554 1726853156.06549: variable 'ansible_search_path' from source: unknown 7554 1726853156.06553: variable 'ansible_search_path' from source: unknown 7554 1726853156.06556: calling self._execute() 7554 1726853156.06626: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.06629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.06640: variable 'omit' from source: magic vars 7554 1726853156.06910: variable 'ansible_distribution_major_version' from source: facts 7554 1726853156.06920: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853156.07028: variable 'type' from source: play vars 7554 1726853156.07032: variable 'state' from source: include params 7554 1726853156.07037: Evaluated conditional (type == 'veth' and state == 'present'): True 7554 1726853156.07043: variable 'omit' from source: magic vars 7554 1726853156.07074: variable 'omit' from source: magic vars 7554 1726853156.07144: variable 'interface' from source: play vars 7554 1726853156.07159: variable 'omit' from source: magic vars 7554 1726853156.07197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853156.07222: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853156.07238: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853156.07253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853156.07262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853156.07288: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853156.07291: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.07293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.07363: Set connection var ansible_shell_executable to /bin/sh 7554 1726853156.07369: Set connection var ansible_pipelining to False 7554 1726853156.07374: Set connection var ansible_shell_type to sh 7554 1726853156.07376: Set connection var ansible_connection to ssh 7554 1726853156.07384: Set connection var ansible_timeout to 10 7554 1726853156.07389: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853156.07407: variable 'ansible_shell_executable' from source: unknown 7554 1726853156.07410: variable 'ansible_connection' from source: unknown 7554 1726853156.07413: variable 'ansible_module_compression' from source: unknown 7554 1726853156.07416: variable 'ansible_shell_type' from source: unknown 7554 1726853156.07418: variable 'ansible_shell_executable' from source: unknown 7554 1726853156.07420: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.07424: variable 'ansible_pipelining' from source: unknown 7554 1726853156.07426: variable 'ansible_timeout' from source: unknown 7554 1726853156.07434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.07532: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853156.07540: variable 'omit' from source: magic vars 7554 1726853156.07551: starting attempt loop 7554 1726853156.07554: running the handler 7554 1726853156.07566: _low_level_execute_command(): starting 7554 1726853156.07574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853156.08083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.08087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.08090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.08093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.08143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853156.08147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853156.08149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.08220: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7554 1726853156.09870: stdout chunk (state=3): >>>/root <<< 7554 1726853156.09965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.09992: stderr chunk (state=3): >>><<< 7554 1726853156.09995: stdout chunk (state=3): >>><<< 7554 1726853156.10015: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853156.10028: _low_level_execute_command(): starting 7554 1726853156.10034: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271 `" && echo ansible-tmp-1726853156.1001513-8027-117526197658271="` echo /root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271 `" ) 
&& sleep 0' 7554 1726853156.10460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.10472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853156.10475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853156.10477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853156.10479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.10516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853156.10520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.10587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.12513: stdout chunk (state=3): >>>ansible-tmp-1726853156.1001513-8027-117526197658271=/root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271 <<< 7554 1726853156.12621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.12650: stderr chunk (state=3): >>><<< 7554 1726853156.12653: stdout chunk (state=3): >>><<< 7554 1726853156.12665: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853156.1001513-8027-117526197658271=/root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853156.12691: variable 'ansible_module_compression' from source: unknown 7554 1726853156.12728: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853156.12762: variable 'ansible_facts' from source: unknown 7554 1726853156.12818: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271/AnsiballZ_command.py 7554 1726853156.12914: Sending initial data 7554 1726853156.12917: Sent initial data (154 bytes) 7554 1726853156.13350: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config <<< 7554 1726853156.13353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853156.13356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.13358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.13360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.13415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853156.13418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.13473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.15058: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853156.15061: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853156.15115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853156.15175: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpxemcsnx4 /root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271/AnsiballZ_command.py <<< 7554 1726853156.15179: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271/AnsiballZ_command.py" <<< 7554 1726853156.15228: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpxemcsnx4" to remote "/root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271/AnsiballZ_command.py" <<< 7554 1726853156.15239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271/AnsiballZ_command.py" <<< 7554 1726853156.15830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.15866: stderr chunk (state=3): >>><<< 7554 1726853156.15869: stdout chunk (state=3): >>><<< 7554 1726853156.15892: done transferring module to remote 7554 1726853156.15898: _low_level_execute_command(): starting 7554 1726853156.15903: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271/ /root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271/AnsiballZ_command.py && sleep 0' 7554 1726853156.16326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config <<< 7554 1726853156.16329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853156.16335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.16337: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.16339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.16386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853156.16389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.16457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.18309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.18312: stdout chunk (state=3): >>><<< 7554 1726853156.18318: stderr chunk (state=3): >>><<< 7554 1726853156.18329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853156.18332: _low_level_execute_command(): starting 7554 1726853156.18336: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271/AnsiballZ_command.py && sleep 0' 7554 1726853156.18748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853156.18751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853156.18754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.18756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.18758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.18807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853156.18810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.18886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.47227: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 13:25:56.345245", "end": "2024-09-20 13:25:56.470347", "delta": "0:00:00.125102", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853156.48863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
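The nmcli task result above reports `start`, `end`, and `delta` timestamps. As a cross-check, `delta` is simply end minus start; a small sketch, assuming the `%Y-%m-%d %H:%M:%S.%f` timestamp format shown in this log, that reproduces the arithmetic:

```python
from datetime import datetime

# Timestamp format used by the command module's start/end fields in this log.
FMT = "%Y-%m-%d %H:%M:%S.%f"

def compute_delta(start: str, end: str) -> str:
    """Recompute the module's 'delta' field from its start/end timestamps."""
    return str(datetime.strptime(end, FMT) - datetime.strptime(start, FMT))

# Values taken verbatim from the nmcli task result above.
print(compute_delta("2024-09-20 13:25:56.345245",
                    "2024-09-20 13:25:56.470347"))  # → 0:00:00.125102
```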
<<< 7554 1726853156.48889: stderr chunk (state=3): >>><<< 7554 1726853156.48893: stdout chunk (state=3): >>><<< 7554 1726853156.48908: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 13:25:56.345245", "end": "2024-09-20 13:25:56.470347", "delta": "0:00:00.125102", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853156.48940: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853156.48946: _low_level_execute_command(): starting 7554 1726853156.48953: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853156.1001513-8027-117526197658271/ > /dev/null 2>&1 && sleep 0' 7554 1726853156.49403: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.49406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.49409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.49411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.49461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853156.49464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.49531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.51404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.51430: stderr chunk (state=3): >>><<< 7554 1726853156.51434: stdout chunk (state=3): >>><<< 7554 1726853156.51449: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 7554 1726853156.51452: handler run complete 7554 1726853156.51469: Evaluated conditional (False): False 7554 1726853156.51480: attempt loop complete, returning result 7554 1726853156.51483: _execute() done 7554 1726853156.51485: dumping result to json 7554 1726853156.51489: done dumping result, returning 7554 1726853156.51497: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-bdc3-98b6-0000000003aa] 7554 1726853156.51502: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003aa 7554 1726853156.51603: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003aa 7554 1726853156.51606: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.125102", "end": "2024-09-20 13:25:56.470347", "rc": 0, "start": "2024-09-20 13:25:56.345245" } 7554 1726853156.51668: no more pending results, returning what we have 7554 1726853156.51672: results queue empty 7554 1726853156.51673: checking for any_errors_fatal 7554 1726853156.51689: done checking for any_errors_fatal 7554 1726853156.51689: checking for max_fail_percentage 7554 1726853156.51691: done checking for max_fail_percentage 7554 1726853156.51691: checking to see if all hosts have failed and the running result is not ok 7554 1726853156.51692: done checking to see if all hosts have failed 7554 1726853156.51693: getting the remaining hosts for this loop 7554 1726853156.51694: done getting the remaining hosts for this loop 7554 1726853156.51698: getting the next task for host managed_node3 7554 1726853156.51703: done getting next task for host managed_node3 7554 1726853156.51706: ^ task is: TASK: Delete veth interface {{ interface }} 7554 1726853156.51708: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853156.51712: getting variables 7554 1726853156.51714: in VariableManager get_vars() 7554 1726853156.51758: Calling all_inventory to load vars for managed_node3 7554 1726853156.51761: Calling groups_inventory to load vars for managed_node3 7554 1726853156.51763: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853156.51778: Calling all_plugins_play to load vars for managed_node3 7554 1726853156.51781: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853156.51785: Calling groups_plugins_play to load vars for managed_node3 7554 1726853156.51912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853156.52036: done with get_vars() 7554 1726853156.52044: done getting variables 7554 1726853156.52090: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853156.52178: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 13:25:56 -0400 (0:00:00.461) 0:00:10.489 ****** 7554 1726853156.52199: entering _queue_task() for managed_node3/command 7554 1726853156.52399: worker is 1 (out of 1 
available) 7554 1726853156.52411: exiting _queue_task() for managed_node3/command 7554 1726853156.52424: done queuing things up, now waiting for results queue to drain 7554 1726853156.52425: waiting for pending results... 7554 1726853156.52581: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7554 1726853156.52643: in run() - task 02083763-bbaf-bdc3-98b6-0000000003ab 7554 1726853156.52658: variable 'ansible_search_path' from source: unknown 7554 1726853156.52664: variable 'ansible_search_path' from source: unknown 7554 1726853156.52695: calling self._execute() 7554 1726853156.52761: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.52768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.52781: variable 'omit' from source: magic vars 7554 1726853156.53039: variable 'ansible_distribution_major_version' from source: facts 7554 1726853156.53052: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853156.53183: variable 'type' from source: play vars 7554 1726853156.53186: variable 'state' from source: include params 7554 1726853156.53190: variable 'interface' from source: play vars 7554 1726853156.53195: variable 'current_interfaces' from source: set_fact 7554 1726853156.53211: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 7554 1726853156.53215: when evaluation is False, skipping this task 7554 1726853156.53217: _execute() done 7554 1726853156.53220: dumping result to json 7554 1726853156.53223: done dumping result, returning 7554 1726853156.53225: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [02083763-bbaf-bdc3-98b6-0000000003ab] 7554 1726853156.53227: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003ab 7554 1726853156.53300: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003ab 7554 1726853156.53302: WORKER PROCESS 
EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7554 1726853156.53354: no more pending results, returning what we have 7554 1726853156.53357: results queue empty 7554 1726853156.53358: checking for any_errors_fatal 7554 1726853156.53368: done checking for any_errors_fatal 7554 1726853156.53369: checking for max_fail_percentage 7554 1726853156.53370: done checking for max_fail_percentage 7554 1726853156.53373: checking to see if all hosts have failed and the running result is not ok 7554 1726853156.53374: done checking to see if all hosts have failed 7554 1726853156.53374: getting the remaining hosts for this loop 7554 1726853156.53375: done getting the remaining hosts for this loop 7554 1726853156.53379: getting the next task for host managed_node3 7554 1726853156.53384: done getting next task for host managed_node3 7554 1726853156.53386: ^ task is: TASK: Create dummy interface {{ interface }} 7554 1726853156.53389: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853156.53393: getting variables 7554 1726853156.53394: in VariableManager get_vars() 7554 1726853156.53433: Calling all_inventory to load vars for managed_node3 7554 1726853156.53435: Calling groups_inventory to load vars for managed_node3 7554 1726853156.53437: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853156.53445: Calling all_plugins_play to load vars for managed_node3 7554 1726853156.53447: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853156.53450: Calling groups_plugins_play to load vars for managed_node3 7554 1726853156.53559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853156.53711: done with get_vars() 7554 1726853156.53718: done getting variables 7554 1726853156.53758: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853156.53833: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:25:56 -0400 (0:00:00.016) 0:00:10.506 ****** 7554 1726853156.53853: entering _queue_task() for managed_node3/command 7554 1726853156.54030: worker is 1 (out of 1 available) 7554 1726853156.54043: exiting _queue_task() for managed_node3/command 7554 1726853156.54054: done queuing things up, now waiting for results queue to drain 7554 1726853156.54056: waiting for pending results... 
7554 1726853156.54205: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7554 1726853156.54264: in run() - task 02083763-bbaf-bdc3-98b6-0000000003ac 7554 1726853156.54278: variable 'ansible_search_path' from source: unknown 7554 1726853156.54284: variable 'ansible_search_path' from source: unknown 7554 1726853156.54310: calling self._execute() 7554 1726853156.54375: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.54378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.54388: variable 'omit' from source: magic vars 7554 1726853156.54633: variable 'ansible_distribution_major_version' from source: facts 7554 1726853156.54642: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853156.54776: variable 'type' from source: play vars 7554 1726853156.54780: variable 'state' from source: include params 7554 1726853156.54785: variable 'interface' from source: play vars 7554 1726853156.54789: variable 'current_interfaces' from source: set_fact 7554 1726853156.54797: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7554 1726853156.54799: when evaluation is False, skipping this task 7554 1726853156.54802: _execute() done 7554 1726853156.54804: dumping result to json 7554 1726853156.54807: done dumping result, returning 7554 1726853156.54812: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [02083763-bbaf-bdc3-98b6-0000000003ac] 7554 1726853156.54819: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003ac 7554 1726853156.54891: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003ac 7554 1726853156.54894: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7554 
1726853156.54963: no more pending results, returning what we have 7554 1726853156.54966: results queue empty 7554 1726853156.54966: checking for any_errors_fatal 7554 1726853156.54979: done checking for any_errors_fatal 7554 1726853156.54980: checking for max_fail_percentage 7554 1726853156.54981: done checking for max_fail_percentage 7554 1726853156.54982: checking to see if all hosts have failed and the running result is not ok 7554 1726853156.54983: done checking to see if all hosts have failed 7554 1726853156.54984: getting the remaining hosts for this loop 7554 1726853156.54985: done getting the remaining hosts for this loop 7554 1726853156.54987: getting the next task for host managed_node3 7554 1726853156.54992: done getting next task for host managed_node3 7554 1726853156.54994: ^ task is: TASK: Delete dummy interface {{ interface }} 7554 1726853156.54997: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853156.55000: getting variables 7554 1726853156.55001: in VariableManager get_vars() 7554 1726853156.55032: Calling all_inventory to load vars for managed_node3 7554 1726853156.55034: Calling groups_inventory to load vars for managed_node3 7554 1726853156.55035: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853156.55042: Calling all_plugins_play to load vars for managed_node3 7554 1726853156.55043: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853156.55045: Calling groups_plugins_play to load vars for managed_node3 7554 1726853156.55153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853156.55270: done with get_vars() 7554 1726853156.55279: done getting variables 7554 1726853156.55318: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853156.55389: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:25:56 -0400 (0:00:00.015) 0:00:10.521 ****** 7554 1726853156.55409: entering _queue_task() for managed_node3/command 7554 1726853156.55581: worker is 1 (out of 1 available) 7554 1726853156.55594: exiting _queue_task() for managed_node3/command 7554 1726853156.55605: done queuing things up, now waiting for results queue to drain 7554 1726853156.55606: waiting for pending results... 
7554 1726853156.55749: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7554 1726853156.55807: in run() - task 02083763-bbaf-bdc3-98b6-0000000003ad 7554 1726853156.55819: variable 'ansible_search_path' from source: unknown 7554 1726853156.55823: variable 'ansible_search_path' from source: unknown 7554 1726853156.55851: calling self._execute() 7554 1726853156.55915: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.55919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.55927: variable 'omit' from source: magic vars 7554 1726853156.56166: variable 'ansible_distribution_major_version' from source: facts 7554 1726853156.56177: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853156.56307: variable 'type' from source: play vars 7554 1726853156.56310: variable 'state' from source: include params 7554 1726853156.56313: variable 'interface' from source: play vars 7554 1726853156.56315: variable 'current_interfaces' from source: set_fact 7554 1726853156.56321: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7554 1726853156.56324: when evaluation is False, skipping this task 7554 1726853156.56326: _execute() done 7554 1726853156.56329: dumping result to json 7554 1726853156.56331: done dumping result, returning 7554 1726853156.56337: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [02083763-bbaf-bdc3-98b6-0000000003ad] 7554 1726853156.56342: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003ad 7554 1726853156.56423: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003ad 7554 1726853156.56426: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7554 
1726853156.56475: no more pending results, returning what we have 7554 1726853156.56477: results queue empty 7554 1726853156.56478: checking for any_errors_fatal 7554 1726853156.56482: done checking for any_errors_fatal 7554 1726853156.56483: checking for max_fail_percentage 7554 1726853156.56484: done checking for max_fail_percentage 7554 1726853156.56485: checking to see if all hosts have failed and the running result is not ok 7554 1726853156.56486: done checking to see if all hosts have failed 7554 1726853156.56487: getting the remaining hosts for this loop 7554 1726853156.56488: done getting the remaining hosts for this loop 7554 1726853156.56490: getting the next task for host managed_node3 7554 1726853156.56496: done getting next task for host managed_node3 7554 1726853156.56497: ^ task is: TASK: Create tap interface {{ interface }} 7554 1726853156.56500: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853156.56503: getting variables 7554 1726853156.56504: in VariableManager get_vars() 7554 1726853156.56547: Calling all_inventory to load vars for managed_node3 7554 1726853156.56549: Calling groups_inventory to load vars for managed_node3 7554 1726853156.56550: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853156.56557: Calling all_plugins_play to load vars for managed_node3 7554 1726853156.56558: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853156.56560: Calling groups_plugins_play to load vars for managed_node3 7554 1726853156.56698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853156.56812: done with get_vars() 7554 1726853156.56818: done getting variables 7554 1726853156.56855: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853156.56924: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:25:56 -0400 (0:00:00.015) 0:00:10.536 ****** 7554 1726853156.56942: entering _queue_task() for managed_node3/command 7554 1726853156.57111: worker is 1 (out of 1 available) 7554 1726853156.57124: exiting _queue_task() for managed_node3/command 7554 1726853156.57134: done queuing things up, now waiting for results queue to drain 7554 1726853156.57135: waiting for pending results... 
7554 1726853156.57276: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7554 1726853156.57331: in run() - task 02083763-bbaf-bdc3-98b6-0000000003ae 7554 1726853156.57343: variable 'ansible_search_path' from source: unknown 7554 1726853156.57346: variable 'ansible_search_path' from source: unknown 7554 1726853156.57377: calling self._execute() 7554 1726853156.57437: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.57441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.57452: variable 'omit' from source: magic vars 7554 1726853156.57692: variable 'ansible_distribution_major_version' from source: facts 7554 1726853156.57696: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853156.57821: variable 'type' from source: play vars 7554 1726853156.57825: variable 'state' from source: include params 7554 1726853156.57830: variable 'interface' from source: play vars 7554 1726853156.57833: variable 'current_interfaces' from source: set_fact 7554 1726853156.57840: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7554 1726853156.57843: when evaluation is False, skipping this task 7554 1726853156.57846: _execute() done 7554 1726853156.57851: dumping result to json 7554 1726853156.57853: done dumping result, returning 7554 1726853156.57858: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [02083763-bbaf-bdc3-98b6-0000000003ae] 7554 1726853156.57865: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003ae 7554 1726853156.57939: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003ae 7554 1726853156.57942: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7554 
1726853156.57989: no more pending results, returning what we have 7554 1726853156.57992: results queue empty 7554 1726853156.57993: checking for any_errors_fatal 7554 1726853156.57998: done checking for any_errors_fatal 7554 1726853156.57999: checking for max_fail_percentage 7554 1726853156.58001: done checking for max_fail_percentage 7554 1726853156.58001: checking to see if all hosts have failed and the running result is not ok 7554 1726853156.58002: done checking to see if all hosts have failed 7554 1726853156.58003: getting the remaining hosts for this loop 7554 1726853156.58004: done getting the remaining hosts for this loop 7554 1726853156.58007: getting the next task for host managed_node3 7554 1726853156.58011: done getting next task for host managed_node3 7554 1726853156.58013: ^ task is: TASK: Delete tap interface {{ interface }} 7554 1726853156.58016: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853156.58019: getting variables 7554 1726853156.58020: in VariableManager get_vars() 7554 1726853156.58063: Calling all_inventory to load vars for managed_node3 7554 1726853156.58066: Calling groups_inventory to load vars for managed_node3 7554 1726853156.58068: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853156.58076: Calling all_plugins_play to load vars for managed_node3 7554 1726853156.58078: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853156.58080: Calling groups_plugins_play to load vars for managed_node3 7554 1726853156.58187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853156.58303: done with get_vars() 7554 1726853156.58310: done getting variables 7554 1726853156.58346: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853156.58418: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:25:56 -0400 (0:00:00.014) 0:00:10.551 ****** 7554 1726853156.58437: entering _queue_task() for managed_node3/command 7554 1726853156.58605: worker is 1 (out of 1 available) 7554 1726853156.58620: exiting _queue_task() for managed_node3/command 7554 1726853156.58632: done queuing things up, now waiting for results queue to drain 7554 1726853156.58634: waiting for pending results... 
7554 1726853156.58778: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7554 1726853156.58837: in run() - task 02083763-bbaf-bdc3-98b6-0000000003af 7554 1726853156.58851: variable 'ansible_search_path' from source: unknown 7554 1726853156.58857: variable 'ansible_search_path' from source: unknown 7554 1726853156.58886: calling self._execute() 7554 1726853156.58941: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.58946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.58957: variable 'omit' from source: magic vars 7554 1726853156.59198: variable 'ansible_distribution_major_version' from source: facts 7554 1726853156.59204: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853156.59332: variable 'type' from source: play vars 7554 1726853156.59335: variable 'state' from source: include params 7554 1726853156.59341: variable 'interface' from source: play vars 7554 1726853156.59344: variable 'current_interfaces' from source: set_fact 7554 1726853156.59354: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7554 1726853156.59357: when evaluation is False, skipping this task 7554 1726853156.59359: _execute() done 7554 1726853156.59362: dumping result to json 7554 1726853156.59364: done dumping result, returning 7554 1726853156.59370: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [02083763-bbaf-bdc3-98b6-0000000003af] 7554 1726853156.59376: sending task result for task 02083763-bbaf-bdc3-98b6-0000000003af 7554 1726853156.59452: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000003af 7554 1726853156.59455: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7554 1726853156.59501: no 
more pending results, returning what we have 7554 1726853156.59504: results queue empty 7554 1726853156.59505: checking for any_errors_fatal 7554 1726853156.59509: done checking for any_errors_fatal 7554 1726853156.59510: checking for max_fail_percentage 7554 1726853156.59511: done checking for max_fail_percentage 7554 1726853156.59512: checking to see if all hosts have failed and the running result is not ok 7554 1726853156.59513: done checking to see if all hosts have failed 7554 1726853156.59514: getting the remaining hosts for this loop 7554 1726853156.59515: done getting the remaining hosts for this loop 7554 1726853156.59518: getting the next task for host managed_node3 7554 1726853156.59524: done getting next task for host managed_node3 7554 1726853156.59526: ^ task is: TASK: Include the task 'assert_device_present.yml' 7554 1726853156.59528: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853156.59532: getting variables 7554 1726853156.59533: in VariableManager get_vars() 7554 1726853156.59570: Calling all_inventory to load vars for managed_node3 7554 1726853156.59574: Calling groups_inventory to load vars for managed_node3 7554 1726853156.59576: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853156.59584: Calling all_plugins_play to load vars for managed_node3 7554 1726853156.59586: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853156.59589: Calling groups_plugins_play to load vars for managed_node3 7554 1726853156.59726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853156.59838: done with get_vars() 7554 1726853156.59844: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:15 Friday 20 September 2024 13:25:56 -0400 (0:00:00.014) 0:00:10.566 ****** 7554 1726853156.59903: entering _queue_task() for managed_node3/include_tasks 7554 1726853156.60072: worker is 1 (out of 1 available) 7554 1726853156.60086: exiting _queue_task() for managed_node3/include_tasks 7554 1726853156.60098: done queuing things up, now waiting for results queue to drain 7554 1726853156.60099: waiting for pending results... 
7554 1726853156.60242: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7554 1726853156.60296: in run() - task 02083763-bbaf-bdc3-98b6-00000000000d 7554 1726853156.60307: variable 'ansible_search_path' from source: unknown 7554 1726853156.60336: calling self._execute() 7554 1726853156.60399: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.60403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.60411: variable 'omit' from source: magic vars 7554 1726853156.60661: variable 'ansible_distribution_major_version' from source: facts 7554 1726853156.60670: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853156.60677: _execute() done 7554 1726853156.60680: dumping result to json 7554 1726853156.60683: done dumping result, returning 7554 1726853156.60689: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [02083763-bbaf-bdc3-98b6-00000000000d] 7554 1726853156.60694: sending task result for task 02083763-bbaf-bdc3-98b6-00000000000d 7554 1726853156.60776: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000000d 7554 1726853156.60780: WORKER PROCESS EXITING 7554 1726853156.60807: no more pending results, returning what we have 7554 1726853156.60811: in VariableManager get_vars() 7554 1726853156.60854: Calling all_inventory to load vars for managed_node3 7554 1726853156.60857: Calling groups_inventory to load vars for managed_node3 7554 1726853156.60859: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853156.60867: Calling all_plugins_play to load vars for managed_node3 7554 1726853156.60870: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853156.60874: Calling groups_plugins_play to load vars for managed_node3 7554 1726853156.60990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7554 1726853156.61101: done with get_vars() 7554 1726853156.61106: variable 'ansible_search_path' from source: unknown 7554 1726853156.61115: we have included files to process 7554 1726853156.61116: generating all_blocks data 7554 1726853156.61118: done generating all_blocks data 7554 1726853156.61121: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7554 1726853156.61122: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7554 1726853156.61124: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7554 1726853156.61223: in VariableManager get_vars() 7554 1726853156.61242: done with get_vars() 7554 1726853156.61312: done processing included file 7554 1726853156.61314: iterating over new_blocks loaded from include file 7554 1726853156.61315: in VariableManager get_vars() 7554 1726853156.61327: done with get_vars() 7554 1726853156.61328: filtering new block on tags 7554 1726853156.61341: done filtering new block on tags 7554 1726853156.61342: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7554 1726853156.61346: extending task lists for all hosts with included blocks 7554 1726853156.63721: done extending task lists 7554 1726853156.63722: done processing included files 7554 1726853156.63723: results queue empty 7554 1726853156.63724: checking for any_errors_fatal 7554 1726853156.63726: done checking for any_errors_fatal 7554 1726853156.63727: checking for max_fail_percentage 7554 1726853156.63728: done checking for max_fail_percentage 7554 1726853156.63729: checking to see if all hosts have failed and the running 
result is not ok 7554 1726853156.63729: done checking to see if all hosts have failed 7554 1726853156.63730: getting the remaining hosts for this loop 7554 1726853156.63731: done getting the remaining hosts for this loop 7554 1726853156.63732: getting the next task for host managed_node3 7554 1726853156.63735: done getting next task for host managed_node3 7554 1726853156.63736: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7554 1726853156.63738: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
7554 1726853156.63739: getting variables
7554 1726853156.63740: in VariableManager get_vars()
7554 1726853156.63752: Calling all_inventory to load vars for managed_node3
7554 1726853156.63754: Calling groups_inventory to load vars for managed_node3
7554 1726853156.63755: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853156.63759: Calling all_plugins_play to load vars for managed_node3
7554 1726853156.63761: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853156.63762: Calling groups_plugins_play to load vars for managed_node3
7554 1726853156.63851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853156.63959: done with get_vars()
7554 1726853156.63966: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Friday 20 September 2024 13:25:56 -0400 (0:00:00.041) 0:00:10.607 ******
7554 1726853156.64015: entering _queue_task() for managed_node3/include_tasks
7554 1726853156.64226: worker is 1 (out of 1 available)
7554 1726853156.64239: exiting _queue_task() for managed_node3/include_tasks
7554 1726853156.64252: done queuing things up, now waiting for results queue to drain
7554 1726853156.64253: waiting for pending results... 
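Every debug line in this transcript is prefixed with the ansible-playbook process id (`7554` here) and a fractional Unix epoch timestamp; the human-readable banner time (`Friday 20 September 2024 13:25:56 -0400`) is that same instant rendered in local time. A minimal stdlib sketch for splitting that prefix (the helper name `parse_prefix` is mine, not part of Ansible):

```python
import re
from datetime import datetime, timezone

# A -vvvv debug line starts with "<pid> <epoch.frac>: <message>".
PREFIX = re.compile(r"^(\d+) (\d+\.\d+): (.*)$")

def parse_prefix(line: str):
    """Split a verbose log line into (pid, UTC datetime, message).

    Returns None for lines without the prefix (TASK banners, chunk
    continuations, etc.).
    """
    m = PREFIX.match(line)
    if m is None:
        return None
    pid, epoch, msg = m.groups()
    return int(pid), datetime.fromtimestamp(float(epoch), tz=timezone.utc), msg

pid, when, msg = parse_prefix(
    "7554 1726853156.64015: entering _queue_task() for managed_node3/include_tasks"
)
# 13:25:56 -0400 in the banner corresponds to 17:25:56 UTC here.
```

This is useful when correlating the epoch prefixes against the `-0400` banner timestamps while reading long runs like this one.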
7554 1726853156.64418: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7554 1726853156.64490: in run() - task 02083763-bbaf-bdc3-98b6-0000000005f5 7554 1726853156.64495: variable 'ansible_search_path' from source: unknown 7554 1726853156.64498: variable 'ansible_search_path' from source: unknown 7554 1726853156.64524: calling self._execute() 7554 1726853156.64588: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.64593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.64604: variable 'omit' from source: magic vars 7554 1726853156.64884: variable 'ansible_distribution_major_version' from source: facts 7554 1726853156.64894: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853156.64899: _execute() done 7554 1726853156.64903: dumping result to json 7554 1726853156.64907: done dumping result, returning 7554 1726853156.64913: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-bdc3-98b6-0000000005f5] 7554 1726853156.64917: sending task result for task 02083763-bbaf-bdc3-98b6-0000000005f5 7554 1726853156.64998: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000005f5 7554 1726853156.65001: WORKER PROCESS EXITING 7554 1726853156.65045: no more pending results, returning what we have 7554 1726853156.65049: in VariableManager get_vars() 7554 1726853156.65099: Calling all_inventory to load vars for managed_node3 7554 1726853156.65101: Calling groups_inventory to load vars for managed_node3 7554 1726853156.65103: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853156.65112: Calling all_plugins_play to load vars for managed_node3 7554 1726853156.65114: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853156.65117: Calling groups_plugins_play to load vars for managed_node3 7554 1726853156.65250: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853156.65361: done with get_vars() 7554 1726853156.65367: variable 'ansible_search_path' from source: unknown 7554 1726853156.65368: variable 'ansible_search_path' from source: unknown 7554 1726853156.65394: we have included files to process 7554 1726853156.65395: generating all_blocks data 7554 1726853156.65396: done generating all_blocks data 7554 1726853156.65397: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7554 1726853156.65398: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7554 1726853156.65399: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7554 1726853156.65546: done processing included file 7554 1726853156.65548: iterating over new_blocks loaded from include file 7554 1726853156.65549: in VariableManager get_vars() 7554 1726853156.65564: done with get_vars() 7554 1726853156.65565: filtering new block on tags 7554 1726853156.65577: done filtering new block on tags 7554 1726853156.65578: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7554 1726853156.65582: extending task lists for all hosts with included blocks 7554 1726853156.65639: done extending task lists 7554 1726853156.65640: done processing included files 7554 1726853156.65640: results queue empty 7554 1726853156.65641: checking for any_errors_fatal 7554 1726853156.65643: done checking for any_errors_fatal 7554 1726853156.65644: checking for max_fail_percentage 7554 1726853156.65645: done checking for max_fail_percentage 7554 1726853156.65646: 
checking to see if all hosts have failed and the running result is not ok 7554 1726853156.65646: done checking to see if all hosts have failed 7554 1726853156.65647: getting the remaining hosts for this loop 7554 1726853156.65648: done getting the remaining hosts for this loop 7554 1726853156.65649: getting the next task for host managed_node3 7554 1726853156.65652: done getting next task for host managed_node3 7554 1726853156.65653: ^ task is: TASK: Get stat for interface {{ interface }} 7554 1726853156.65655: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
7554 1726853156.65656: getting variables
7554 1726853156.65657: in VariableManager get_vars()
7554 1726853156.65667: Calling all_inventory to load vars for managed_node3
7554 1726853156.65668: Calling groups_inventory to load vars for managed_node3
7554 1726853156.65670: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853156.65675: Calling all_plugins_play to load vars for managed_node3
7554 1726853156.65676: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853156.65678: Calling groups_plugins_play to load vars for managed_node3
7554 1726853156.65759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853156.65884: done with get_vars()
7554 1726853156.65891: done getting variables
7554 1726853156.65994: variable 'interface' from source: play vars

TASK [Get stat for interface veth0] ********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 13:25:56 -0400 (0:00:00.019) 0:00:10.627 ******
7554 1726853156.66014: entering _queue_task() for managed_node3/stat
7554 1726853156.66208: worker is 1 (out of 1 available)
7554 1726853156.66223: exiting _queue_task() for managed_node3/stat
7554 1726853156.66234: done queuing things up, now waiting for results queue to drain
7554 1726853156.66235: waiting for pending results... 
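Note how the raw task name `Get stat for interface {{ interface }}` from the previous state line is rendered with the play var `interface=veth0` before the TASK banner is printed. Ansible runs the full Jinja2 engine for this; as a stdlib-only illustration of the simple `{{ var }}` case (this `render` helper is mine and is not Ansible's templar), the substitution behaves like:

```python
import re

def render(template: str, variables: dict) -> str:
    """Replace simple {{ name }} references, the way the banner
    'Get stat for interface {{ interface }}' became '... veth0'.
    Illustration only: real Ansible uses the full Jinja2 engine,
    with filters, lookups, and recursive templating.
    """
    def sub(match: re.Match) -> str:
        return str(variables[match.group(1)])
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", sub, template)

render("Get stat for interface {{ interface }}", {"interface": "veth0"})
```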
7554 1726853156.66394: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7554 1726853156.66462: in run() - task 02083763-bbaf-bdc3-98b6-0000000007ee 7554 1726853156.66474: variable 'ansible_search_path' from source: unknown 7554 1726853156.66478: variable 'ansible_search_path' from source: unknown 7554 1726853156.66507: calling self._execute() 7554 1726853156.66570: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.66582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.66590: variable 'omit' from source: magic vars 7554 1726853156.66844: variable 'ansible_distribution_major_version' from source: facts 7554 1726853156.66855: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853156.66860: variable 'omit' from source: magic vars 7554 1726853156.66892: variable 'omit' from source: magic vars 7554 1726853156.66958: variable 'interface' from source: play vars 7554 1726853156.66973: variable 'omit' from source: magic vars 7554 1726853156.67008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853156.67036: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853156.67052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853156.67066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853156.67077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853156.67102: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853156.67105: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.67108: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7554 1726853156.67180: Set connection var ansible_shell_executable to /bin/sh 7554 1726853156.67187: Set connection var ansible_pipelining to False 7554 1726853156.67190: Set connection var ansible_shell_type to sh 7554 1726853156.67192: Set connection var ansible_connection to ssh 7554 1726853156.67200: Set connection var ansible_timeout to 10 7554 1726853156.67205: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853156.67223: variable 'ansible_shell_executable' from source: unknown 7554 1726853156.67226: variable 'ansible_connection' from source: unknown 7554 1726853156.67229: variable 'ansible_module_compression' from source: unknown 7554 1726853156.67231: variable 'ansible_shell_type' from source: unknown 7554 1726853156.67233: variable 'ansible_shell_executable' from source: unknown 7554 1726853156.67236: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.67238: variable 'ansible_pipelining' from source: unknown 7554 1726853156.67240: variable 'ansible_timeout' from source: unknown 7554 1726853156.67242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.67385: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853156.67394: variable 'omit' from source: magic vars 7554 1726853156.67399: starting attempt loop 7554 1726853156.67402: running the handler 7554 1726853156.67413: _low_level_execute_command(): starting 7554 1726853156.67420: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853156.67936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 
1726853156.67940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.67942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.67944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.67998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853156.68002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853156.68010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.68073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.69777: stdout chunk (state=3): >>>/root <<< 7554 1726853156.69879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.69906: stderr chunk (state=3): >>><<< 7554 1726853156.69909: stdout chunk (state=3): >>><<< 7554 1726853156.69929: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853156.69942: _low_level_execute_command(): starting 7554 1726853156.69948: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351 `" && echo ansible-tmp-1726853156.6992998-8049-192474262026351="` echo /root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351 `" ) && sleep 0' 7554 1726853156.70400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.70405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.70415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853156.70417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853156.70420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.70460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853156.70463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853156.70469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.70537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.72485: stdout chunk (state=3): >>>ansible-tmp-1726853156.6992998-8049-192474262026351=/root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351 <<< 7554 1726853156.72589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.72618: stderr chunk (state=3): >>><<< 7554 1726853156.72622: stdout chunk (state=3): >>><<< 7554 1726853156.72641: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853156.6992998-8049-192474262026351=/root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853156.72687: variable 'ansible_module_compression' from source: unknown 7554 1726853156.72730: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7554 1726853156.72763: variable 'ansible_facts' from source: unknown 7554 1726853156.72828: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351/AnsiballZ_stat.py 7554 1726853156.72931: Sending initial data 7554 1726853156.72934: Sent initial data (151 bytes) 7554 1726853156.73395: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853156.73400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853156.73402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.73404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.73406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.73453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853156.73470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.73524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.75115: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853156.75119: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853156.75172: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853156.75232: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp_3zjr1r_ /root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351/AnsiballZ_stat.py <<< 7554 1726853156.75235: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351/AnsiballZ_stat.py" <<< 7554 1726853156.75291: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp_3zjr1r_" to remote "/root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351/AnsiballZ_stat.py" <<< 7554 1726853156.75897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.75937: stderr chunk (state=3): >>><<< 7554 1726853156.75941: stdout chunk (state=3): >>><<< 7554 1726853156.75974: done transferring module to remote 7554 1726853156.75983: _low_level_execute_command(): starting 7554 1726853156.75987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351/ /root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351/AnsiballZ_stat.py && sleep 0' 7554 1726853156.76442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853156.76448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853156.76450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.76452: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.76458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853156.76460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.76508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853156.76511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853156.76515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.76570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.78366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.78395: stderr chunk (state=3): >>><<< 7554 1726853156.78398: stdout chunk (state=3): >>><<< 7554 1726853156.78414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853156.78417: _low_level_execute_command(): starting 7554 1726853156.78422: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351/AnsiballZ_stat.py && sleep 0' 7554 1726853156.78860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.78864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.78877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.78930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 
1726853156.78937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853156.78939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.78997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.94606: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25123, "dev": 23, "nlink": 1, "atime": 1726853155.267833, "mtime": 1726853155.267833, "ctime": 1726853155.267833, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7554 1726853156.95964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853156.95996: stderr chunk (state=3): >>><<< 7554 1726853156.95999: stdout chunk (state=3): >>><<< 7554 1726853156.96014: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25123, "dev": 23, "nlink": 1, "atime": 1726853155.267833, "mtime": 1726853155.267833, "ctime": 1726853155.267833, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853156.96055: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853156.96064: _low_level_execute_command(): starting 7554 1726853156.96066: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853156.6992998-8049-192474262026351/ > /dev/null 2>&1 && sleep 0' 7554 1726853156.96515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853156.96518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853156.96520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 
1726853156.96523: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853156.96525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853156.96577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853156.96585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853156.96588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853156.96648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853156.98497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853156.98522: stderr chunk (state=3): >>><<< 7554 1726853156.98525: stdout chunk (state=3): >>><<< 7554 1726853156.98538: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853156.98544: handler run complete 7554 1726853156.98579: attempt loop complete, returning result 7554 1726853156.98582: _execute() done 7554 1726853156.98585: dumping result to json 7554 1726853156.98589: done dumping result, returning 7554 1726853156.98596: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [02083763-bbaf-bdc3-98b6-0000000007ee] 7554 1726853156.98601: sending task result for task 02083763-bbaf-bdc3-98b6-0000000007ee 7554 1726853156.98707: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000007ee 7554 1726853156.98709: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853155.267833, "block_size": 4096, "blocks": 0, "ctime": 1726853155.267833, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25123, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726853155.267833, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7554 1726853156.98795: no more pending results, returning what we have 7554 1726853156.98798: 
results queue empty 7554 1726853156.98799: checking for any_errors_fatal 7554 1726853156.98800: done checking for any_errors_fatal 7554 1726853156.98801: checking for max_fail_percentage 7554 1726853156.98802: done checking for max_fail_percentage 7554 1726853156.98803: checking to see if all hosts have failed and the running result is not ok 7554 1726853156.98804: done checking to see if all hosts have failed 7554 1726853156.98805: getting the remaining hosts for this loop 7554 1726853156.98806: done getting the remaining hosts for this loop 7554 1726853156.98809: getting the next task for host managed_node3 7554 1726853156.98816: done getting next task for host managed_node3 7554 1726853156.98818: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7554 1726853156.98821: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853156.98824: getting variables 7554 1726853156.98826: in VariableManager get_vars() 7554 1726853156.98877: Calling all_inventory to load vars for managed_node3 7554 1726853156.98880: Calling groups_inventory to load vars for managed_node3 7554 1726853156.98882: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853156.98891: Calling all_plugins_play to load vars for managed_node3 7554 1726853156.98894: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853156.98896: Calling groups_plugins_play to load vars for managed_node3 7554 1726853156.99021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853156.99139: done with get_vars() 7554 1726853156.99149: done getting variables 7554 1726853156.99221: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 7554 1726853156.99307: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:25:56 -0400 (0:00:00.333) 0:00:10.960 ****** 7554 1726853156.99330: entering _queue_task() for managed_node3/assert 7554 1726853156.99334: Creating lock for assert 7554 1726853156.99525: worker is 1 (out of 1 available) 7554 1726853156.99538: exiting _queue_task() for managed_node3/assert 7554 1726853156.99551: done queuing things up, now waiting for results queue to drain 7554 1726853156.99553: waiting for pending results... 
7554 1726853156.99712: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7554 1726853156.99765: in run() - task 02083763-bbaf-bdc3-98b6-0000000005f6 7554 1726853156.99781: variable 'ansible_search_path' from source: unknown 7554 1726853156.99785: variable 'ansible_search_path' from source: unknown 7554 1726853156.99811: calling self._execute() 7554 1726853156.99873: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853156.99876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853156.99888: variable 'omit' from source: magic vars 7554 1726853157.00394: variable 'ansible_distribution_major_version' from source: facts 7554 1726853157.00403: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853157.00409: variable 'omit' from source: magic vars 7554 1726853157.00437: variable 'omit' from source: magic vars 7554 1726853157.00502: variable 'interface' from source: play vars 7554 1726853157.00515: variable 'omit' from source: magic vars 7554 1726853157.00549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853157.00574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853157.00590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853157.00602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853157.00612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853157.00635: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853157.00638: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.00642: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.00709: Set connection var ansible_shell_executable to /bin/sh 7554 1726853157.00716: Set connection var ansible_pipelining to False 7554 1726853157.00718: Set connection var ansible_shell_type to sh 7554 1726853157.00721: Set connection var ansible_connection to ssh 7554 1726853157.00729: Set connection var ansible_timeout to 10 7554 1726853157.00733: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853157.00762: variable 'ansible_shell_executable' from source: unknown 7554 1726853157.00765: variable 'ansible_connection' from source: unknown 7554 1726853157.00769: variable 'ansible_module_compression' from source: unknown 7554 1726853157.00773: variable 'ansible_shell_type' from source: unknown 7554 1726853157.00775: variable 'ansible_shell_executable' from source: unknown 7554 1726853157.00777: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.00779: variable 'ansible_pipelining' from source: unknown 7554 1726853157.00781: variable 'ansible_timeout' from source: unknown 7554 1726853157.00783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.00861: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853157.00869: variable 'omit' from source: magic vars 7554 1726853157.00880: starting attempt loop 7554 1726853157.00884: running the handler 7554 1726853157.00969: variable 'interface_stat' from source: set_fact 7554 1726853157.00988: Evaluated conditional (interface_stat.stat.exists): True 7554 1726853157.00991: handler run complete 7554 1726853157.01002: attempt loop complete, returning result 7554 1726853157.01005: _execute() done 
7554 1726853157.01008: dumping result to json 7554 1726853157.01011: done dumping result, returning 7554 1726853157.01014: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [02083763-bbaf-bdc3-98b6-0000000005f6] 7554 1726853157.01020: sending task result for task 02083763-bbaf-bdc3-98b6-0000000005f6 7554 1726853157.01094: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000005f6 7554 1726853157.01097: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853157.01138: no more pending results, returning what we have 7554 1726853157.01141: results queue empty 7554 1726853157.01142: checking for any_errors_fatal 7554 1726853157.01147: done checking for any_errors_fatal 7554 1726853157.01148: checking for max_fail_percentage 7554 1726853157.01150: done checking for max_fail_percentage 7554 1726853157.01150: checking to see if all hosts have failed and the running result is not ok 7554 1726853157.01151: done checking to see if all hosts have failed 7554 1726853157.01152: getting the remaining hosts for this loop 7554 1726853157.01153: done getting the remaining hosts for this loop 7554 1726853157.01156: getting the next task for host managed_node3 7554 1726853157.01162: done getting next task for host managed_node3 7554 1726853157.01165: ^ task is: TASK: TEST: I can configure an interface with auto_gateway enabled 7554 1726853157.01167: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853157.01170: getting variables 7554 1726853157.01173: in VariableManager get_vars() 7554 1726853157.01212: Calling all_inventory to load vars for managed_node3 7554 1726853157.01214: Calling groups_inventory to load vars for managed_node3 7554 1726853157.01216: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853157.01224: Calling all_plugins_play to load vars for managed_node3 7554 1726853157.01227: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853157.01229: Calling groups_plugins_play to load vars for managed_node3 7554 1726853157.01534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853157.01641: done with get_vars() 7554 1726853157.01648: done getting variables 7554 1726853157.01685: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with auto_gateway enabled] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:17 Friday 20 September 2024 13:25:57 -0400 (0:00:00.023) 0:00:10.984 ****** 7554 1726853157.01704: entering _queue_task() for managed_node3/debug 7554 1726853157.01869: worker is 1 (out of 1 available) 7554 1726853157.01883: exiting _queue_task() for managed_node3/debug 7554 1726853157.01893: done queuing things up, now waiting for results queue to drain 7554 1726853157.01895: waiting for pending results... 
7554 1726853157.02048: running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway enabled 7554 1726853157.02100: in run() - task 02083763-bbaf-bdc3-98b6-00000000000e 7554 1726853157.02111: variable 'ansible_search_path' from source: unknown 7554 1726853157.02139: calling self._execute() 7554 1726853157.02200: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.02206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.02215: variable 'omit' from source: magic vars 7554 1726853157.02468: variable 'ansible_distribution_major_version' from source: facts 7554 1726853157.02479: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853157.02484: variable 'omit' from source: magic vars 7554 1726853157.02497: variable 'omit' from source: magic vars 7554 1726853157.02519: variable 'omit' from source: magic vars 7554 1726853157.02558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853157.02586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853157.02601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853157.02615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853157.02624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853157.02645: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853157.02651: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.02654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.02722: Set connection var ansible_shell_executable to /bin/sh 
7554 1726853157.02728: Set connection var ansible_pipelining to False 7554 1726853157.02731: Set connection var ansible_shell_type to sh 7554 1726853157.02734: Set connection var ansible_connection to ssh 7554 1726853157.02741: Set connection var ansible_timeout to 10 7554 1726853157.02746: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853157.02764: variable 'ansible_shell_executable' from source: unknown 7554 1726853157.02768: variable 'ansible_connection' from source: unknown 7554 1726853157.02773: variable 'ansible_module_compression' from source: unknown 7554 1726853157.02776: variable 'ansible_shell_type' from source: unknown 7554 1726853157.02778: variable 'ansible_shell_executable' from source: unknown 7554 1726853157.02782: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.02785: variable 'ansible_pipelining' from source: unknown 7554 1726853157.02787: variable 'ansible_timeout' from source: unknown 7554 1726853157.02789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.02880: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853157.02889: variable 'omit' from source: magic vars 7554 1726853157.02894: starting attempt loop 7554 1726853157.02896: running the handler 7554 1726853157.02933: handler run complete 7554 1726853157.02945: attempt loop complete, returning result 7554 1726853157.02949: _execute() done 7554 1726853157.02952: dumping result to json 7554 1726853157.02954: done dumping result, returning 7554 1726853157.02961: done running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway enabled [02083763-bbaf-bdc3-98b6-00000000000e] 7554 1726853157.02966: 
sending task result for task 02083763-bbaf-bdc3-98b6-00000000000e ok: [managed_node3] => {} MSG: ################################################## 7554 1726853157.03090: no more pending results, returning what we have 7554 1726853157.03093: results queue empty 7554 1726853157.03094: checking for any_errors_fatal 7554 1726853157.03099: done checking for any_errors_fatal 7554 1726853157.03099: checking for max_fail_percentage 7554 1726853157.03101: done checking for max_fail_percentage 7554 1726853157.03102: checking to see if all hosts have failed and the running result is not ok 7554 1726853157.03103: done checking to see if all hosts have failed 7554 1726853157.03103: getting the remaining hosts for this loop 7554 1726853157.03105: done getting the remaining hosts for this loop 7554 1726853157.03107: getting the next task for host managed_node3 7554 1726853157.03112: done getting next task for host managed_node3 7554 1726853157.03118: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7554 1726853157.03120: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853157.03133: getting variables 7554 1726853157.03134: in VariableManager get_vars() 7554 1726853157.03170: Calling all_inventory to load vars for managed_node3 7554 1726853157.03174: Calling groups_inventory to load vars for managed_node3 7554 1726853157.03176: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853157.03184: Calling all_plugins_play to load vars for managed_node3 7554 1726853157.03187: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853157.03190: Calling groups_plugins_play to load vars for managed_node3 7554 1726853157.03292: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000000e 7554 1726853157.03296: WORKER PROCESS EXITING 7554 1726853157.03307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853157.03434: done with get_vars() 7554 1726853157.03440: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:25:57 -0400 (0:00:00.017) 0:00:11.002 ****** 7554 1726853157.03500: entering _queue_task() for managed_node3/include_tasks 7554 1726853157.03663: worker is 1 (out of 1 available) 7554 1726853157.03676: exiting _queue_task() for managed_node3/include_tasks 7554 1726853157.03686: done queuing things up, now waiting for results queue to drain 7554 1726853157.03688: waiting for pending results... 
7554 1726853157.03836: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7554 1726853157.03920: in run() - task 02083763-bbaf-bdc3-98b6-000000000016 7554 1726853157.03927: variable 'ansible_search_path' from source: unknown 7554 1726853157.03930: variable 'ansible_search_path' from source: unknown 7554 1726853157.03957: calling self._execute() 7554 1726853157.04018: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.04030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.04034: variable 'omit' from source: magic vars 7554 1726853157.04335: variable 'ansible_distribution_major_version' from source: facts 7554 1726853157.04344: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853157.04351: _execute() done 7554 1726853157.04355: dumping result to json 7554 1726853157.04358: done dumping result, returning 7554 1726853157.04361: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-bdc3-98b6-000000000016] 7554 1726853157.04373: sending task result for task 02083763-bbaf-bdc3-98b6-000000000016 7554 1726853157.04442: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000016 7554 1726853157.04448: WORKER PROCESS EXITING 7554 1726853157.04504: no more pending results, returning what we have 7554 1726853157.04507: in VariableManager get_vars() 7554 1726853157.04549: Calling all_inventory to load vars for managed_node3 7554 1726853157.04552: Calling groups_inventory to load vars for managed_node3 7554 1726853157.04554: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853157.04561: Calling all_plugins_play to load vars for managed_node3 7554 1726853157.04564: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853157.04566: Calling groups_plugins_play to load vars for 
managed_node3 7554 1726853157.04701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853157.04815: done with get_vars() 7554 1726853157.04820: variable 'ansible_search_path' from source: unknown 7554 1726853157.04821: variable 'ansible_search_path' from source: unknown 7554 1726853157.04847: we have included files to process 7554 1726853157.04848: generating all_blocks data 7554 1726853157.04849: done generating all_blocks data 7554 1726853157.04851: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853157.04852: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853157.04853: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853157.05295: done processing included file 7554 1726853157.05296: iterating over new_blocks loaded from include file 7554 1726853157.05297: in VariableManager get_vars() 7554 1726853157.05314: done with get_vars() 7554 1726853157.05315: filtering new block on tags 7554 1726853157.05327: done filtering new block on tags 7554 1726853157.05329: in VariableManager get_vars() 7554 1726853157.05347: done with get_vars() 7554 1726853157.05348: filtering new block on tags 7554 1726853157.05360: done filtering new block on tags 7554 1726853157.05362: in VariableManager get_vars() 7554 1726853157.05379: done with get_vars() 7554 1726853157.05380: filtering new block on tags 7554 1726853157.05390: done filtering new block on tags 7554 1726853157.05392: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7554 1726853157.05395: extending task lists for all hosts with included blocks 7554 1726853157.05856: done 
extending task lists 7554 1726853157.05857: done processing included files 7554 1726853157.05858: results queue empty 7554 1726853157.05858: checking for any_errors_fatal 7554 1726853157.05861: done checking for any_errors_fatal 7554 1726853157.05861: checking for max_fail_percentage 7554 1726853157.05862: done checking for max_fail_percentage 7554 1726853157.05863: checking to see if all hosts have failed and the running result is not ok 7554 1726853157.05864: done checking to see if all hosts have failed 7554 1726853157.05864: getting the remaining hosts for this loop 7554 1726853157.05865: done getting the remaining hosts for this loop 7554 1726853157.05867: getting the next task for host managed_node3 7554 1726853157.05869: done getting next task for host managed_node3 7554 1726853157.05872: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7554 1726853157.05874: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853157.05880: getting variables 7554 1726853157.05880: in VariableManager get_vars() 7554 1726853157.05891: Calling all_inventory to load vars for managed_node3 7554 1726853157.05893: Calling groups_inventory to load vars for managed_node3 7554 1726853157.05894: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853157.05897: Calling all_plugins_play to load vars for managed_node3 7554 1726853157.05898: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853157.05900: Calling groups_plugins_play to load vars for managed_node3 7554 1726853157.05980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853157.06093: done with get_vars() 7554 1726853157.06099: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:25:57 -0400 (0:00:00.026) 0:00:11.028 ****** 7554 1726853157.06142: entering _queue_task() for managed_node3/setup 7554 1726853157.06310: worker is 1 (out of 1 available) 7554 1726853157.06322: exiting _queue_task() for managed_node3/setup 7554 1726853157.06332: done queuing things up, now waiting for results queue to drain 7554 1726853157.06334: waiting for pending results... 
7554 1726853157.06483: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7554 1726853157.06568: in run() - task 02083763-bbaf-bdc3-98b6-000000000809 7554 1726853157.06577: variable 'ansible_search_path' from source: unknown 7554 1726853157.06580: variable 'ansible_search_path' from source: unknown 7554 1726853157.06606: calling self._execute() 7554 1726853157.06662: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.06668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.06681: variable 'omit' from source: magic vars 7554 1726853157.06909: variable 'ansible_distribution_major_version' from source: facts 7554 1726853157.06918: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853157.07077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853157.08435: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853157.08485: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853157.08511: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853157.08539: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853157.08559: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853157.08614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853157.08634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853157.08660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853157.08687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853157.08698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853157.08734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853157.08757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853157.08775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853157.08799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853157.08810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853157.08907: variable '__network_required_facts' from source: role '' defaults 
7554 1726853157.08914: variable 'ansible_facts' from source: unknown 7554 1726853157.08973: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7554 1726853157.08977: when evaluation is False, skipping this task 7554 1726853157.08980: _execute() done 7554 1726853157.08982: dumping result to json 7554 1726853157.08985: done dumping result, returning 7554 1726853157.08990: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-bdc3-98b6-000000000809] 7554 1726853157.08995: sending task result for task 02083763-bbaf-bdc3-98b6-000000000809 7554 1726853157.09070: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000809 7554 1726853157.09074: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853157.09115: no more pending results, returning what we have 7554 1726853157.09119: results queue empty 7554 1726853157.09119: checking for any_errors_fatal 7554 1726853157.09120: done checking for any_errors_fatal 7554 1726853157.09121: checking for max_fail_percentage 7554 1726853157.09122: done checking for max_fail_percentage 7554 1726853157.09123: checking to see if all hosts have failed and the running result is not ok 7554 1726853157.09124: done checking to see if all hosts have failed 7554 1726853157.09125: getting the remaining hosts for this loop 7554 1726853157.09126: done getting the remaining hosts for this loop 7554 1726853157.09129: getting the next task for host managed_node3 7554 1726853157.09136: done getting next task for host managed_node3 7554 1726853157.09139: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7554 1726853157.09143: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853157.09154: getting variables 7554 1726853157.09156: in VariableManager get_vars() 7554 1726853157.09194: Calling all_inventory to load vars for managed_node3 7554 1726853157.09196: Calling groups_inventory to load vars for managed_node3 7554 1726853157.09198: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853157.09205: Calling all_plugins_play to load vars for managed_node3 7554 1726853157.09208: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853157.09210: Calling groups_plugins_play to load vars for managed_node3 7554 1726853157.09345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853157.09476: done with get_vars() 7554 1726853157.09483: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:25:57 -0400 (0:00:00.033) 0:00:11.062 ****** 7554 1726853157.09544: entering _queue_task() for managed_node3/stat 7554 1726853157.09708: worker is 1 (out of 1 available) 7554 1726853157.09721: exiting _queue_task() 
for managed_node3/stat 7554 1726853157.09731: done queuing things up, now waiting for results queue to drain 7554 1726853157.09733: waiting for pending results... 7554 1726853157.09894: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7554 1726853157.09986: in run() - task 02083763-bbaf-bdc3-98b6-00000000080b 7554 1726853157.09998: variable 'ansible_search_path' from source: unknown 7554 1726853157.10002: variable 'ansible_search_path' from source: unknown 7554 1726853157.10027: calling self._execute() 7554 1726853157.10086: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.10092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.10100: variable 'omit' from source: magic vars 7554 1726853157.10350: variable 'ansible_distribution_major_version' from source: facts 7554 1726853157.10357: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853157.10468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853157.10655: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853157.10687: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853157.10710: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853157.10757: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853157.10814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853157.10831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853157.10856: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853157.10875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853157.10931: variable '__network_is_ostree' from source: set_fact 7554 1726853157.10942: Evaluated conditional (not __network_is_ostree is defined): False 7554 1726853157.10948: when evaluation is False, skipping this task 7554 1726853157.10950: _execute() done 7554 1726853157.10953: dumping result to json 7554 1726853157.10956: done dumping result, returning 7554 1726853157.10959: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-bdc3-98b6-00000000080b] 7554 1726853157.10961: sending task result for task 02083763-bbaf-bdc3-98b6-00000000080b 7554 1726853157.11036: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000080b 7554 1726853157.11038: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7554 1726853157.11090: no more pending results, returning what we have 7554 1726853157.11092: results queue empty 7554 1726853157.11093: checking for any_errors_fatal 7554 1726853157.11100: done checking for any_errors_fatal 7554 1726853157.11101: checking for max_fail_percentage 7554 1726853157.11102: done checking for max_fail_percentage 7554 1726853157.11102: checking to see if all hosts have failed and the running result is not ok 7554 1726853157.11104: done checking to see if all hosts have failed 7554 
1726853157.11104: getting the remaining hosts for this loop 7554 1726853157.11105: done getting the remaining hosts for this loop 7554 1726853157.11108: getting the next task for host managed_node3 7554 1726853157.11113: done getting next task for host managed_node3 7554 1726853157.11116: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7554 1726853157.11119: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853157.11130: getting variables 7554 1726853157.11131: in VariableManager get_vars() 7554 1726853157.11169: Calling all_inventory to load vars for managed_node3 7554 1726853157.11173: Calling groups_inventory to load vars for managed_node3 7554 1726853157.11175: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853157.11183: Calling all_plugins_play to load vars for managed_node3 7554 1726853157.11185: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853157.11187: Calling groups_plugins_play to load vars for managed_node3 7554 1726853157.11291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853157.11411: done with get_vars() 7554 1726853157.11417: done getting variables 7554 1726853157.11455: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:25:57 -0400 (0:00:00.019) 0:00:11.082 ****** 7554 1726853157.11478: entering _queue_task() for managed_node3/set_fact 7554 1726853157.11636: worker is 1 (out of 1 available) 7554 1726853157.11651: exiting _queue_task() for managed_node3/set_fact 7554 1726853157.11661: done queuing things up, now waiting for results queue to drain 7554 1726853157.11662: waiting for pending results... 
7554 1726853157.11801: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7554 1726853157.11885: in run() - task 02083763-bbaf-bdc3-98b6-00000000080c 7554 1726853157.11899: variable 'ansible_search_path' from source: unknown 7554 1726853157.11902: variable 'ansible_search_path' from source: unknown 7554 1726853157.11924: calling self._execute() 7554 1726853157.11980: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.11984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.11993: variable 'omit' from source: magic vars 7554 1726853157.12274: variable 'ansible_distribution_major_version' from source: facts 7554 1726853157.12284: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853157.12388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853157.12558: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853157.12589: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853157.12613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853157.12636: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853157.12694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853157.12711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853157.12728: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853157.12747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853157.12805: variable '__network_is_ostree' from source: set_fact 7554 1726853157.12809: Evaluated conditional (not __network_is_ostree is defined): False 7554 1726853157.12812: when evaluation is False, skipping this task 7554 1726853157.12814: _execute() done 7554 1726853157.12817: dumping result to json 7554 1726853157.12821: done dumping result, returning 7554 1726853157.12827: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-bdc3-98b6-00000000080c] 7554 1726853157.12832: sending task result for task 02083763-bbaf-bdc3-98b6-00000000080c 7554 1726853157.12935: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000080c 7554 1726853157.12938: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7554 1726853157.13010: no more pending results, returning what we have 7554 1726853157.13013: results queue empty 7554 1726853157.13013: checking for any_errors_fatal 7554 1726853157.13017: done checking for any_errors_fatal 7554 1726853157.13017: checking for max_fail_percentage 7554 1726853157.13018: done checking for max_fail_percentage 7554 1726853157.13019: checking to see if all hosts have failed and the running result is not ok 7554 1726853157.13020: done checking to see if all hosts have failed 7554 1726853157.13020: getting the remaining hosts for this loop 7554 1726853157.13021: done getting the remaining hosts for this loop 7554 1726853157.13024: 
getting the next task for host managed_node3 7554 1726853157.13029: done getting next task for host managed_node3 7554 1726853157.13031: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7554 1726853157.13034: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853157.13042: getting variables 7554 1726853157.13043: in VariableManager get_vars() 7554 1726853157.13076: Calling all_inventory to load vars for managed_node3 7554 1726853157.13078: Calling groups_inventory to load vars for managed_node3 7554 1726853157.13116: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853157.13122: Calling all_plugins_play to load vars for managed_node3 7554 1726853157.13123: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853157.13125: Calling groups_plugins_play to load vars for managed_node3 7554 1726853157.13220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853157.13339: done with get_vars() 7554 1726853157.13348: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:25:57 -0400 (0:00:00.019) 0:00:11.101 ****** 7554 1726853157.13407: entering _queue_task() for managed_node3/service_facts 7554 1726853157.13408: Creating lock for service_facts 7554 1726853157.13581: worker is 1 (out of 1 available) 7554 1726853157.13595: exiting _queue_task() for managed_node3/service_facts 7554 1726853157.13605: done queuing things up, now waiting for results queue to drain 7554 1726853157.13607: waiting for pending results... 
7554 1726853157.13748: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7554 1726853157.13836: in run() - task 02083763-bbaf-bdc3-98b6-00000000080e 7554 1726853157.13843: variable 'ansible_search_path' from source: unknown 7554 1726853157.13849: variable 'ansible_search_path' from source: unknown 7554 1726853157.13874: calling self._execute() 7554 1726853157.13927: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.13931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.13940: variable 'omit' from source: magic vars 7554 1726853157.14194: variable 'ansible_distribution_major_version' from source: facts 7554 1726853157.14200: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853157.14206: variable 'omit' from source: magic vars 7554 1726853157.14253: variable 'omit' from source: magic vars 7554 1726853157.14280: variable 'omit' from source: magic vars 7554 1726853157.14309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853157.14333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853157.14351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853157.14364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853157.14379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853157.14398: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853157.14401: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.14405: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853157.14473: Set connection var ansible_shell_executable to /bin/sh 7554 1726853157.14480: Set connection var ansible_pipelining to False 7554 1726853157.14483: Set connection var ansible_shell_type to sh 7554 1726853157.14485: Set connection var ansible_connection to ssh 7554 1726853157.14494: Set connection var ansible_timeout to 10 7554 1726853157.14499: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853157.14516: variable 'ansible_shell_executable' from source: unknown 7554 1726853157.14519: variable 'ansible_connection' from source: unknown 7554 1726853157.14522: variable 'ansible_module_compression' from source: unknown 7554 1726853157.14524: variable 'ansible_shell_type' from source: unknown 7554 1726853157.14526: variable 'ansible_shell_executable' from source: unknown 7554 1726853157.14528: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853157.14530: variable 'ansible_pipelining' from source: unknown 7554 1726853157.14533: variable 'ansible_timeout' from source: unknown 7554 1726853157.14539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853157.14670: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853157.14681: variable 'omit' from source: magic vars 7554 1726853157.14684: starting attempt loop 7554 1726853157.14687: running the handler 7554 1726853157.14698: _low_level_execute_command(): starting 7554 1726853157.14711: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853157.15207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853157.15211: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853157.15214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853157.15216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853157.15269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853157.15275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853157.15277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853157.15351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853157.17036: stdout chunk (state=3): >>>/root <<< 7554 1726853157.17132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853157.17159: stderr chunk (state=3): >>><<< 7554 1726853157.17163: stdout chunk (state=3): >>><<< 7554 1726853157.17182: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853157.17193: _low_level_execute_command(): starting 7554 1726853157.17198: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854 `" && echo ansible-tmp-1726853157.1718187-8063-46224095805854="` echo /root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854 `" ) && sleep 0' 7554 1726853157.17612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853157.17615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853157.17618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853157.17626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853157.17628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853157.17674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853157.17678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853157.17741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853157.19683: stdout chunk (state=3): >>>ansible-tmp-1726853157.1718187-8063-46224095805854=/root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854 <<< 7554 1726853157.19790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853157.19812: stderr chunk (state=3): >>><<< 7554 1726853157.19816: stdout chunk (state=3): >>><<< 7554 1726853157.19831: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853157.1718187-8063-46224095805854=/root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853157.19865: variable 'ansible_module_compression' from source: unknown 7554 1726853157.19902: ANSIBALLZ: Using lock for service_facts 7554 1726853157.19906: ANSIBALLZ: Acquiring lock 7554 1726853157.19908: ANSIBALLZ: Lock acquired: 140257824544800 7554 1726853157.19910: ANSIBALLZ: Creating module 7554 1726853157.27706: ANSIBALLZ: Writing module into payload 7554 1726853157.27773: ANSIBALLZ: Writing module 7554 1726853157.27794: ANSIBALLZ: Renaming module 7554 1726853157.27799: ANSIBALLZ: Done creating module 7554 1726853157.27814: variable 'ansible_facts' from source: unknown 7554 1726853157.27864: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854/AnsiballZ_service_facts.py 7554 1726853157.27964: Sending initial data 7554 1726853157.27967: Sent initial data (159 bytes) 7554 1726853157.28442: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853157.28448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853157.28450: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853157.28452: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853157.28456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853157.28505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853157.28508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853157.28511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853157.28579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853157.30244: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853157.30250: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853157.30303: stderr chunk (state=3): >>>debug2: 
Sending SSH2_FXP_REALPATH "." <<< 7554 1726853157.30361: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpuk2k1u7k /root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854/AnsiballZ_service_facts.py <<< 7554 1726853157.30369: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854/AnsiballZ_service_facts.py" <<< 7554 1726853157.30424: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 7554 1726853157.30427: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpuk2k1u7k" to remote "/root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854/AnsiballZ_service_facts.py" <<< 7554 1726853157.31062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853157.31106: stderr chunk (state=3): >>><<< 7554 1726853157.31109: stdout chunk (state=3): >>><<< 7554 1726853157.31172: done transferring module to remote 7554 1726853157.31182: _low_level_execute_command(): starting 7554 1726853157.31186: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854/ /root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854/AnsiballZ_service_facts.py && sleep 0' 7554 1726853157.31628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853157.31631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853157.31637: stderr chunk (state=3): >>>debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853157.31639: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853157.31641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853157.31691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853157.31695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853157.31758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853157.33568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853157.33592: stderr chunk (state=3): >>><<< 7554 1726853157.33595: stdout chunk (state=3): >>><<< 7554 1726853157.33608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853157.33611: _low_level_execute_command(): starting 7554 1726853157.33615: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854/AnsiballZ_service_facts.py && sleep 0' 7554 1726853157.34047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853157.34052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853157.34054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853157.34058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853157.34060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 
1726853157.34109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853157.34112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853157.34187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853159.08444: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": 
{"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "sy<<< 7554 1726853159.08457: stdout chunk (state=3): >>>stemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {<<< 7554 1726853159.08467: stdout chunk (state=3): >>>"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7554 1726853159.10061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853159.10065: stdout chunk (state=3): >>><<< 7554 1726853159.10068: stderr chunk (state=3): >>><<< 7554 1726853159.10279: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853159.10808: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853159.10824: _low_level_execute_command(): starting 7554 1726853159.10843: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853157.1718187-8063-46224095805854/ > /dev/null 2>&1 && sleep 0' 7554 1726853159.11591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853159.11662: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853159.11685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853159.11742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853159.11811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853159.13725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853159.13749: stdout chunk (state=3): >>><<< 7554 1726853159.13841: stderr chunk (state=3): >>><<< 7554 1726853159.13847: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853159.13850: handler run complete 7554 1726853159.14034: variable 'ansible_facts' from source: unknown 7554 1726853159.14229: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853159.14775: variable 'ansible_facts' from source: unknown 7554 1726853159.15966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853159.16210: attempt loop complete, returning result 7554 1726853159.16222: _execute() done 7554 1726853159.16230: dumping result to json 7554 1726853159.16298: done dumping result, returning 7554 1726853159.16377: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-bdc3-98b6-00000000080e] 7554 1726853159.16381: sending task result for task 02083763-bbaf-bdc3-98b6-00000000080e 7554 1726853159.17300: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000080e 7554 1726853159.17303: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853159.17368: no more pending results, returning what we have 7554 1726853159.17419: results queue empty 7554 1726853159.17420: checking for any_errors_fatal 7554 1726853159.17427: done checking for any_errors_fatal 7554 1726853159.17428: checking for max_fail_percentage 7554 1726853159.17429: done checking for max_fail_percentage 7554 1726853159.17430: checking to see if all hosts have failed and the running result is not ok 7554 1726853159.17431: done checking to see if all hosts have failed 7554 1726853159.17432: getting the remaining hosts for this loop 7554 1726853159.17433: done getting the remaining hosts for this loop 7554 1726853159.17437: getting the next task for host managed_node3 7554 1726853159.17443: done getting next task for host managed_node3 7554 1726853159.17449: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7554 1726853159.17453: ^ state is: 
HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853159.17463: getting variables 7554 1726853159.17465: in VariableManager get_vars() 7554 1726853159.17510: Calling all_inventory to load vars for managed_node3 7554 1726853159.17513: Calling groups_inventory to load vars for managed_node3 7554 1726853159.17516: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853159.17640: Calling all_plugins_play to load vars for managed_node3 7554 1726853159.17647: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853159.17651: Calling groups_plugins_play to load vars for managed_node3 7554 1726853159.18113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853159.18665: done with get_vars() 7554 1726853159.18681: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:25:59 -0400 (0:00:02.053) 0:00:13.155 ****** 7554 1726853159.18795: entering _queue_task() for managed_node3/package_facts 
7554 1726853159.18797: Creating lock for package_facts 7554 1726853159.19150: worker is 1 (out of 1 available) 7554 1726853159.19277: exiting _queue_task() for managed_node3/package_facts 7554 1726853159.19288: done queuing things up, now waiting for results queue to drain 7554 1726853159.19289: waiting for pending results... 7554 1726853159.19699: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7554 1726853159.19705: in run() - task 02083763-bbaf-bdc3-98b6-00000000080f 7554 1726853159.19708: variable 'ansible_search_path' from source: unknown 7554 1726853159.19711: variable 'ansible_search_path' from source: unknown 7554 1726853159.19776: calling self._execute() 7554 1726853159.19829: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853159.19841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853159.19858: variable 'omit' from source: magic vars 7554 1726853159.20250: variable 'ansible_distribution_major_version' from source: facts 7554 1726853159.20269: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853159.20342: variable 'omit' from source: magic vars 7554 1726853159.20364: variable 'omit' from source: magic vars 7554 1726853159.20405: variable 'omit' from source: magic vars 7554 1726853159.20456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853159.20499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853159.20524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853159.20547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853159.20572: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853159.20606: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853159.20614: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853159.20670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853159.20736: Set connection var ansible_shell_executable to /bin/sh 7554 1726853159.20753: Set connection var ansible_pipelining to False 7554 1726853159.20760: Set connection var ansible_shell_type to sh 7554 1726853159.20768: Set connection var ansible_connection to ssh 7554 1726853159.20791: Set connection var ansible_timeout to 10 7554 1726853159.20802: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853159.20831: variable 'ansible_shell_executable' from source: unknown 7554 1726853159.20838: variable 'ansible_connection' from source: unknown 7554 1726853159.20876: variable 'ansible_module_compression' from source: unknown 7554 1726853159.20884: variable 'ansible_shell_type' from source: unknown 7554 1726853159.20890: variable 'ansible_shell_executable' from source: unknown 7554 1726853159.20892: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853159.20895: variable 'ansible_pipelining' from source: unknown 7554 1726853159.20897: variable 'ansible_timeout' from source: unknown 7554 1726853159.20898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853159.21079: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853159.21087: variable 'omit' from source: magic vars 7554 1726853159.21092: starting attempt loop 7554 1726853159.21095: running the handler 7554 
1726853159.21110: _low_level_execute_command(): starting 7554 1726853159.21118: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853159.21612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853159.21615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853159.21617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853159.21620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853159.21622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853159.21674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853159.21678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853159.21747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853159.23453: stdout chunk (state=3): >>>/root <<< 7554 1726853159.23552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853159.23577: stderr chunk (state=3): >>><<< 7554 1726853159.23580: stdout chunk (state=3): >>><<< 7554 1726853159.23596: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853159.23610: _low_level_execute_command(): starting 7554 1726853159.23615: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970 `" && echo ansible-tmp-1726853159.2359736-8102-188667437144970="` echo /root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970 `" ) && sleep 0' 7554 1726853159.24022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853159.24025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853159.24037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853159.24040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853159.24042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853159.24083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853159.24087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853159.24155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853159.26151: stdout chunk (state=3): >>>ansible-tmp-1726853159.2359736-8102-188667437144970=/root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970 <<< 7554 1726853159.26218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853159.26259: stderr chunk (state=3): >>><<< 7554 1726853159.26270: stdout chunk (state=3): >>><<< 7554 1726853159.26287: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853159.2359736-8102-188667437144970=/root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853159.26341: variable 'ansible_module_compression' from source: unknown 7554 1726853159.26375: ANSIBALLZ: Using lock for package_facts 7554 1726853159.26381: ANSIBALLZ: Acquiring lock 7554 1726853159.26384: ANSIBALLZ: Lock acquired: 140257822569536 7554 1726853159.26388: ANSIBALLZ: Creating module 7554 1726853159.45676: ANSIBALLZ: Writing module into payload 7554 1726853159.45763: ANSIBALLZ: Writing module 7554 1726853159.45785: ANSIBALLZ: Renaming module 7554 1726853159.45791: ANSIBALLZ: Done creating module 7554 1726853159.45814: variable 'ansible_facts' from source: unknown 7554 1726853159.45911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970/AnsiballZ_package_facts.py 7554 1726853159.46014: Sending initial data 7554 1726853159.46017: Sent initial data (160 bytes) 7554 1726853159.46476: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7554 1726853159.46479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853159.46482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853159.46484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853159.46536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853159.46539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853159.46541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853159.46612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853159.48290: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 7554 1726853159.48297: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853159.48350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853159.48406: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpk0jner2w /root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970/AnsiballZ_package_facts.py <<< 7554 1726853159.48413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970/AnsiballZ_package_facts.py" <<< 7554 1726853159.48465: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpk0jner2w" to remote "/root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970/AnsiballZ_package_facts.py" <<< 7554 1726853159.48468: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970/AnsiballZ_package_facts.py" <<< 7554 1726853159.49597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853159.49641: stderr chunk (state=3): >>><<< 7554 1726853159.49644: stdout chunk (state=3): >>><<< 7554 1726853159.49677: done transferring module to remote 7554 1726853159.49686: _low_level_execute_command(): starting 7554 1726853159.49691: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970/ /root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970/AnsiballZ_package_facts.py && sleep 0' 7554 1726853159.50128: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
<<< 7554 1726853159.50131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853159.50137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853159.50139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853159.50141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853159.50199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853159.50203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853159.50254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853159.52160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853159.52164: stdout chunk (state=3): >>><<< 7554 1726853159.52166: stderr chunk (state=3): >>><<< 7554 1726853159.52268: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853159.52274: _low_level_execute_command(): starting 7554 1726853159.52277: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970/AnsiballZ_package_facts.py && sleep 0' 7554 1726853159.52889: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853159.52939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853159.52955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853159.52981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853159.53079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853159.98096: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": 
[{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", 
"version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": 
[{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": 
"rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": 
"1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": 
"1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": 
[{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": 
[{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 7554 1726853159.98244: stdout chunk 
(state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", 
"release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 7554 1726853159.98252: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 7554 1726853159.98256: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 7554 1726853159.98287: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", 
"version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": 
[{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 7554 1726853159.98292: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", 
"release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 7554 1726853159.98307: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": 
"2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7554 1726853160.00195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
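The JSON blob above is the result of Ansible's `package_facts` module (note the `invocation` footer: `manager: ["auto"]`, `strategy: "first"`). Its `ansible_facts.packages` key maps each package name to a *list* of installed instances, since packages like `kernel` can be installed more than once. The sketch below is not part of the log; it uses a small hand-copied excerpt of the data to show how that structure can be queried, e.g. in a filter plugin or a post-processing script.

```python
# Hypothetical sketch: querying the `ansible_facts.packages` structure
# returned by package_facts. `facts` is a hand-copied excerpt of the
# log output above, not the full module result.
facts = {
    "packages": {
        "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10",
                 "epoch": None, "arch": "noarch", "source": "rpm"}],
        "kernel": [{"name": "kernel", "version": "6.11.0",
                    "release": "0.rc6.23.el10", "epoch": None,
                    "arch": "x86_64", "source": "rpm"}],
    }
}

def nvr(pkg):
    """Format one package entry as name-version-release."""
    return f"{pkg['name']}-{pkg['version']}-{pkg['release']}"

# Each key maps to a list of instances, so keep the list shape
# when flattening to printable strings.
installed = {name: [nvr(p) for p in entries]
             for name, entries in facts["packages"].items()}

print(installed["kernel"])  # -> ['kernel-6.11.0-0.rc6.23.el10']
```

In a playbook the same lookup is typically written as `ansible_facts.packages['kernel'][0].version` after a `package_facts:` task.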
<<< 7554 1726853160.00227: stderr chunk (state=3): >>><<< 7554 1726853160.00230: stdout chunk (state=3): >>><<< 7554 1726853160.00266: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853160.01358: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853160.01378: _low_level_execute_command(): starting 7554 1726853160.01382: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853159.2359736-8102-188667437144970/ > /dev/null 2>&1 && sleep 0' 7554 1726853160.01843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853160.01849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853160.01852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853160.01854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found <<< 7554 1726853160.01856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853160.01905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853160.01911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853160.01919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853160.01982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853160.03885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853160.03911: stderr chunk (state=3): >>><<< 7554 1726853160.03914: stdout chunk (state=3): >>><<< 7554 1726853160.03927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 7554 1726853160.03933: handler run complete 7554 1726853160.07474: variable 'ansible_facts' from source: unknown 7554 1726853160.07697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.08710: variable 'ansible_facts' from source: unknown 7554 1726853160.08930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.09308: attempt loop complete, returning result 7554 1726853160.09318: _execute() done 7554 1726853160.09320: dumping result to json 7554 1726853160.09433: done dumping result, returning 7554 1726853160.09439: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-bdc3-98b6-00000000080f] 7554 1726853160.09442: sending task result for task 02083763-bbaf-bdc3-98b6-00000000080f 7554 1726853160.10784: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000080f 7554 1726853160.10787: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853160.10822: no more pending results, returning what we have 7554 1726853160.10826: results queue empty 7554 1726853160.10827: checking for any_errors_fatal 7554 1726853160.10831: done checking for any_errors_fatal 7554 1726853160.10832: checking for max_fail_percentage 7554 1726853160.10832: done checking for max_fail_percentage 7554 1726853160.10833: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.10834: done checking to see if all hosts have failed 7554 1726853160.10834: getting the remaining hosts for this loop 7554 1726853160.10835: done getting the remaining hosts for this loop 7554 1726853160.10837: getting the next task for host managed_node3 7554 
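The censored `ok:` result above is produced by `no_log: true` on the `package_facts` invocation: Ansible replaces the registered result in its output with the "censored" placeholder. A minimal sketch of a task that yields this behavior (the task body and `no_log` placement are assumptions for illustration, not taken from the role source):

```yaml
# Hypothetical sketch: gathering package facts with output suppressed.
# With no_log: true, Ansible hides the task's result in all logs and
# callbacks, emitting the "censored" placeholder seen in the log above.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
  no_log: true
```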
1726853160.10841: done getting next task for host managed_node3 7554 1726853160.10844: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7554 1726853160.10846: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853160.10853: getting variables 7554 1726853160.10854: in VariableManager get_vars() 7554 1726853160.10884: Calling all_inventory to load vars for managed_node3 7554 1726853160.10886: Calling groups_inventory to load vars for managed_node3 7554 1726853160.10887: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.10894: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.10895: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.10897: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.11536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.12383: done with get_vars() 7554 1726853160.12397: done getting variables 7554 1726853160.12438: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:26:00 -0400 (0:00:00.936) 0:00:14.092 ****** 7554 1726853160.12466: entering _queue_task() for managed_node3/debug 7554 1726853160.12680: worker is 1 (out of 1 available) 7554 1726853160.12693: exiting _queue_task() for managed_node3/debug 7554 1726853160.12704: done queuing things up, now waiting for results queue to drain 7554 1726853160.12705: waiting for pending results... 7554 1726853160.12881: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7554 1726853160.12967: in run() - task 02083763-bbaf-bdc3-98b6-000000000017 7554 1726853160.12981: variable 'ansible_search_path' from source: unknown 7554 1726853160.12985: variable 'ansible_search_path' from source: unknown 7554 1726853160.13015: calling self._execute() 7554 1726853160.13087: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.13091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.13099: variable 'omit' from source: magic vars 7554 1726853160.13377: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.13387: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.13393: variable 'omit' from source: magic vars 7554 1726853160.13430: variable 'omit' from source: magic vars 7554 1726853160.13499: variable 'network_provider' from source: set_fact 7554 1726853160.13513: variable 'omit' from source: magic vars 7554 1726853160.13544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853160.13574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853160.13592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853160.13605: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853160.13615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853160.13637: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853160.13641: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.13643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.13715: Set connection var ansible_shell_executable to /bin/sh 7554 1726853160.13722: Set connection var ansible_pipelining to False 7554 1726853160.13725: Set connection var ansible_shell_type to sh 7554 1726853160.13728: Set connection var ansible_connection to ssh 7554 1726853160.13735: Set connection var ansible_timeout to 10 7554 1726853160.13739: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853160.13759: variable 'ansible_shell_executable' from source: unknown 7554 1726853160.13763: variable 'ansible_connection' from source: unknown 7554 1726853160.13766: variable 'ansible_module_compression' from source: unknown 7554 1726853160.13769: variable 'ansible_shell_type' from source: unknown 7554 1726853160.13773: variable 'ansible_shell_executable' from source: unknown 7554 1726853160.13775: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.13777: variable 'ansible_pipelining' from source: unknown 7554 1726853160.13780: variable 'ansible_timeout' from source: unknown 7554 1726853160.13782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.13881: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853160.13889: variable 'omit' from source: magic vars 7554 1726853160.13894: starting attempt loop 7554 1726853160.13897: running the handler 7554 1726853160.13932: handler run complete 7554 1726853160.13942: attempt loop complete, returning result 7554 1726853160.13945: _execute() done 7554 1726853160.13948: dumping result to json 7554 1726853160.13953: done dumping result, returning 7554 1726853160.13960: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-bdc3-98b6-000000000017] 7554 1726853160.13966: sending task result for task 02083763-bbaf-bdc3-98b6-000000000017 7554 1726853160.14039: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000017 7554 1726853160.14042: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7554 1726853160.14094: no more pending results, returning what we have 7554 1726853160.14097: results queue empty 7554 1726853160.14098: checking for any_errors_fatal 7554 1726853160.14105: done checking for any_errors_fatal 7554 1726853160.14105: checking for max_fail_percentage 7554 1726853160.14107: done checking for max_fail_percentage 7554 1726853160.14107: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.14108: done checking to see if all hosts have failed 7554 1726853160.14109: getting the remaining hosts for this loop 7554 1726853160.14110: done getting the remaining hosts for this loop 7554 1726853160.14114: getting the next task for host managed_node3 7554 1726853160.14119: done getting next task for host managed_node3 7554 1726853160.14122: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts 
provider 7554 1726853160.14125: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853160.14135: getting variables 7554 1726853160.14136: in VariableManager get_vars() 7554 1726853160.14176: Calling all_inventory to load vars for managed_node3 7554 1726853160.14179: Calling groups_inventory to load vars for managed_node3 7554 1726853160.14181: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.14188: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.14190: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.14193: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.14962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.15816: done with get_vars() 7554 1726853160.15831: done getting variables 7554 1726853160.15870: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:26:00 -0400 (0:00:00.034) 0:00:14.126 ****** 7554 1726853160.15893: entering _queue_task() for managed_node3/fail 7554 1726853160.16075: worker is 1 (out of 1 available) 7554 1726853160.16089: exiting _queue_task() for managed_node3/fail 7554 1726853160.16100: done queuing things up, now waiting for results queue to drain 7554 1726853160.16102: waiting for pending results... 7554 1726853160.16265: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7554 1726853160.16356: in run() - task 02083763-bbaf-bdc3-98b6-000000000018 7554 1726853160.16367: variable 'ansible_search_path' from source: unknown 7554 1726853160.16372: variable 'ansible_search_path' from source: unknown 7554 1726853160.16400: calling self._execute() 7554 1726853160.16462: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.16466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.16477: variable 'omit' from source: magic vars 7554 1726853160.16731: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.16741: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.16825: variable 'network_state' from source: role '' defaults 7554 1726853160.16833: Evaluated conditional (network_state != {}): False 7554 1726853160.16837: when evaluation is False, skipping this task 7554 1726853160.16840: _execute() done 7554 1726853160.16843: dumping result to json 7554 1726853160.16845: done dumping result, returning 7554 1726853160.16853: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the 
initscripts provider [02083763-bbaf-bdc3-98b6-000000000018] 7554 1726853160.16858: sending task result for task 02083763-bbaf-bdc3-98b6-000000000018 7554 1726853160.16941: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000018 7554 1726853160.16943: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853160.17012: no more pending results, returning what we have 7554 1726853160.17014: results queue empty 7554 1726853160.17015: checking for any_errors_fatal 7554 1726853160.17019: done checking for any_errors_fatal 7554 1726853160.17019: checking for max_fail_percentage 7554 1726853160.17022: done checking for max_fail_percentage 7554 1726853160.17023: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.17023: done checking to see if all hosts have failed 7554 1726853160.17024: getting the remaining hosts for this loop 7554 1726853160.17025: done getting the remaining hosts for this loop 7554 1726853160.17028: getting the next task for host managed_node3 7554 1726853160.17033: done getting next task for host managed_node3 7554 1726853160.17036: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7554 1726853160.17038: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
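The `skipping:` result above, with `false_condition: network_state != {}`, comes from a `when:` guard on a `fail` task. A minimal sketch of such a guarded task (task name and message are assumptions; the condition is the one reported in the log):

```yaml
# Hypothetical sketch of a guarded fail task; the role's actual wording
# may differ. With network_state left at its role default of {}, the
# `when` condition evaluates to False and the task is skipped, exactly
# as recorded in the log above.
- name: Abort when network_state is used with the initscripts provider
  ansible.builtin.fail:
    msg: Cannot use the network_state variable with the initscripts provider
  when: network_state != {}
```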
False 7554 1726853160.17050: getting variables 7554 1726853160.17051: in VariableManager get_vars() 7554 1726853160.17089: Calling all_inventory to load vars for managed_node3 7554 1726853160.17091: Calling groups_inventory to load vars for managed_node3 7554 1726853160.17092: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.17098: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.17099: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.17101: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.17794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.18646: done with get_vars() 7554 1726853160.18660: done getting variables 7554 1726853160.18698: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:26:00 -0400 (0:00:00.028) 0:00:14.154 ****** 7554 1726853160.18720: entering _queue_task() for managed_node3/fail 7554 1726853160.18897: worker is 1 (out of 1 available) 7554 1726853160.18911: exiting _queue_task() for managed_node3/fail 7554 1726853160.18921: done queuing things up, now waiting for results queue to drain 7554 1726853160.18923: waiting for pending results... 
7554 1726853160.19088: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7554 1726853160.19170: in run() - task 02083763-bbaf-bdc3-98b6-000000000019 7554 1726853160.19184: variable 'ansible_search_path' from source: unknown 7554 1726853160.19188: variable 'ansible_search_path' from source: unknown 7554 1726853160.19214: calling self._execute() 7554 1726853160.19284: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.19288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.19298: variable 'omit' from source: magic vars 7554 1726853160.19553: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.19562: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.19646: variable 'network_state' from source: role '' defaults 7554 1726853160.19653: Evaluated conditional (network_state != {}): False 7554 1726853160.19656: when evaluation is False, skipping this task 7554 1726853160.19659: _execute() done 7554 1726853160.19661: dumping result to json 7554 1726853160.19663: done dumping result, returning 7554 1726853160.19672: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-bdc3-98b6-000000000019] 7554 1726853160.19678: sending task result for task 02083763-bbaf-bdc3-98b6-000000000019 7554 1726853160.19761: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000019 7554 1726853160.19764: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853160.19833: no more pending results, returning what we have 7554 1726853160.19835: results queue 
empty 7554 1726853160.19836: checking for any_errors_fatal 7554 1726853160.19840: done checking for any_errors_fatal 7554 1726853160.19841: checking for max_fail_percentage 7554 1726853160.19842: done checking for max_fail_percentage 7554 1726853160.19843: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.19844: done checking to see if all hosts have failed 7554 1726853160.19847: getting the remaining hosts for this loop 7554 1726853160.19848: done getting the remaining hosts for this loop 7554 1726853160.19851: getting the next task for host managed_node3 7554 1726853160.19855: done getting next task for host managed_node3 7554 1726853160.19859: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7554 1726853160.19861: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853160.19875: getting variables 7554 1726853160.19877: in VariableManager get_vars() 7554 1726853160.19906: Calling all_inventory to load vars for managed_node3 7554 1726853160.19908: Calling groups_inventory to load vars for managed_node3 7554 1726853160.19909: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.19915: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.19916: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.19918: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.20666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.21496: done with get_vars() 7554 1726853160.21511: done getting variables 7554 1726853160.21548: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:26:00 -0400 (0:00:00.028) 0:00:14.183 ****** 7554 1726853160.21568: entering _queue_task() for managed_node3/fail 7554 1726853160.21741: worker is 1 (out of 1 available) 7554 1726853160.21754: exiting _queue_task() for managed_node3/fail 7554 1726853160.21765: done queuing things up, now waiting for results queue to drain 7554 1726853160.21767: waiting for pending results... 
7554 1726853160.21930: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7554 1726853160.22012: in run() - task 02083763-bbaf-bdc3-98b6-00000000001a 7554 1726853160.22024: variable 'ansible_search_path' from source: unknown 7554 1726853160.22028: variable 'ansible_search_path' from source: unknown 7554 1726853160.22055: calling self._execute() 7554 1726853160.22122: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.22126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.22134: variable 'omit' from source: magic vars 7554 1726853160.22385: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.22394: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.22511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853160.23973: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853160.24021: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853160.24047: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853160.24175: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853160.24179: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853160.24182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.24185: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.24188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.24215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.24227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.24294: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.24307: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7554 1726853160.24382: variable 'ansible_distribution' from source: facts 7554 1726853160.24386: variable '__network_rh_distros' from source: role '' defaults 7554 1726853160.24396: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7554 1726853160.24544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.24562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.24580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.24606: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.24616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.24651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.24667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.24685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.24708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.24723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.24749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.24766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 
1726853160.24784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.24806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.24816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.25010: variable 'network_connections' from source: task vars 7554 1726853160.25019: variable 'interface' from source: play vars 7554 1726853160.25073: variable 'interface' from source: play vars 7554 1726853160.25084: variable 'network_state' from source: role '' defaults 7554 1726853160.25127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853160.25238: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853160.25269: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853160.25293: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853160.25314: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853160.25342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853160.25360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 7554 1726853160.25385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.25403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853160.25429: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 7554 1726853160.25432: when evaluation is False, skipping this task 7554 1726853160.25435: _execute() done 7554 1726853160.25437: dumping result to json 7554 1726853160.25439: done dumping result, returning 7554 1726853160.25446: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-bdc3-98b6-00000000001a] 7554 1726853160.25453: sending task result for task 02083763-bbaf-bdc3-98b6-00000000001a 7554 1726853160.25529: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000001a 7554 1726853160.25532: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 7554 1726853160.25577: no more pending results, returning what we have 7554 1726853160.25581: results queue empty 7554 1726853160.25581: checking for any_errors_fatal 7554 
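The teaming guard skipped above evaluates two `selectattr` filter chains: one over `network_connections` and one over `network_state`'s interfaces. A sketch of a task carrying that condition (the `when:` expression is copied from the log's `false_condition`; the task body is an assumption). With no connection or interface of type `team`, both chains yield empty lists and the task is skipped:

```yaml
# Hypothetical sketch of the teaming guard on EL10+; condition taken
# from the false_condition in the log, task body assumed. Each chain
# keeps only items that define "type" and match "^team$"; length > 0
# on either side would trigger the fail.
- name: Abort applying teaming configuration on EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on this system version
  when: >-
    network_connections | selectattr("type", "defined")
    | selectattr("type", "match", "^team$") | list | length > 0
    or network_state.get("interfaces", []) | selectattr("type", "defined")
    | selectattr("type", "match", "^team$") | list | length > 0
```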
1726853160.25588: done checking for any_errors_fatal 7554 1726853160.25589: checking for max_fail_percentage 7554 1726853160.25590: done checking for max_fail_percentage 7554 1726853160.25591: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.25592: done checking to see if all hosts have failed 7554 1726853160.25592: getting the remaining hosts for this loop 7554 1726853160.25594: done getting the remaining hosts for this loop 7554 1726853160.25597: getting the next task for host managed_node3 7554 1726853160.25603: done getting next task for host managed_node3 7554 1726853160.25607: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7554 1726853160.25610: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853160.25623: getting variables 7554 1726853160.25624: in VariableManager get_vars() 7554 1726853160.25674: Calling all_inventory to load vars for managed_node3 7554 1726853160.25677: Calling groups_inventory to load vars for managed_node3 7554 1726853160.25679: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.25687: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.25689: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.25691: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.26425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.27349: done with get_vars() 7554 1726853160.27363: done getting variables 7554 1726853160.27429: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:26:00 -0400 (0:00:00.058) 0:00:14.242 ****** 7554 1726853160.27450: entering _queue_task() for managed_node3/dnf 7554 1726853160.27633: worker is 1 (out of 1 available) 7554 1726853160.27645: exiting _queue_task() for managed_node3/dnf 7554 1726853160.27657: done queuing things up, now waiting for results queue to drain 7554 1726853160.27658: waiting for pending results... 
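The task skipped above was gated on the Jinja2 expression `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`. As a minimal pure-Python sketch (function name hypothetical, not part of the role), the filter chain computes this:

```python
# Pure-Python equivalent of the Jinja2 conditional logged above:
#   network_connections | selectattr("type", "defined")
#                       | selectattr("type", "match", "^team$") | list | length > 0
# The name has_team_connections is hypothetical; the role evaluates this inline.
import re

def has_team_connections(network_connections, network_state):
    def team_entries(items):
        # selectattr("type", "defined") keeps only dicts with a "type" key;
        # selectattr("type", "match", "^team$") keeps those whose type is exactly "team".
        return [c for c in items
                if "type" in c and re.match("^team$", str(c["type"]))]
    return (len(team_entries(network_connections)) > 0
            or len(team_entries(network_state.get("interfaces", []))) > 0)

# Consistent with the log: a non-team connection yields False, so the task is skipped.
print(has_team_connections([{"name": "eth0", "type": "ethernet"}], {}))  # False
```

Because neither the play's `network_connections` nor `network_state["interfaces"]` contained a `team`-type entry, the conditional evaluated to `False` and the EL10 teaming-abort task was skipped.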
7554 1726853160.27820: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7554 1726853160.27903: in run() - task 02083763-bbaf-bdc3-98b6-00000000001b 7554 1726853160.27915: variable 'ansible_search_path' from source: unknown 7554 1726853160.27918: variable 'ansible_search_path' from source: unknown 7554 1726853160.27944: calling self._execute() 7554 1726853160.28011: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.28015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.28024: variable 'omit' from source: magic vars 7554 1726853160.28283: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.28292: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.28420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853160.29858: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853160.29907: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853160.29932: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853160.29962: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853160.29984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853160.30035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.30059: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.30080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.30107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.30118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.30195: variable 'ansible_distribution' from source: facts 7554 1726853160.30199: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.30211: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7554 1726853160.30289: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853160.30366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.30384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.30404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.30428: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.30439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.30469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.30487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.30506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.30529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.30539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.30568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.30584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.30600: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.30628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.30639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.30738: variable 'network_connections' from source: task vars 7554 1726853160.30747: variable 'interface' from source: play vars 7554 1726853160.30797: variable 'interface' from source: play vars 7554 1726853160.30847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853160.30959: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853160.30985: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853160.31006: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853160.31027: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853160.31062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853160.31078: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853160.31099: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.31116: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853160.31159: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853160.31320: variable 'network_connections' from source: task vars 7554 1726853160.31324: variable 'interface' from source: play vars 7554 1726853160.31367: variable 'interface' from source: play vars 7554 1726853160.31394: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7554 1726853160.31397: when evaluation is False, skipping this task 7554 1726853160.31399: _execute() done 7554 1726853160.31402: dumping result to json 7554 1726853160.31404: done dumping result, returning 7554 1726853160.31410: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-00000000001b] 7554 1726853160.31415: sending task result for task 02083763-bbaf-bdc3-98b6-00000000001b 7554 1726853160.31499: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000001b 7554 1726853160.31502: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7554 1726853160.31550: no more pending results, returning what we have 7554 1726853160.31553: results queue empty 7554 1726853160.31554: checking for any_errors_fatal 7554 1726853160.31559: done checking for any_errors_fatal 7554 1726853160.31560: checking for 
max_fail_percentage 7554 1726853160.31561: done checking for max_fail_percentage 7554 1726853160.31562: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.31563: done checking to see if all hosts have failed 7554 1726853160.31564: getting the remaining hosts for this loop 7554 1726853160.31565: done getting the remaining hosts for this loop 7554 1726853160.31568: getting the next task for host managed_node3 7554 1726853160.31576: done getting next task for host managed_node3 7554 1726853160.31579: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7554 1726853160.31582: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853160.31594: getting variables 7554 1726853160.31596: in VariableManager get_vars() 7554 1726853160.31642: Calling all_inventory to load vars for managed_node3 7554 1726853160.31647: Calling groups_inventory to load vars for managed_node3 7554 1726853160.31649: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.31657: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.31659: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.31661: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.32398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.33238: done with get_vars() 7554 1726853160.33255: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7554 1726853160.33305: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:26:00 -0400 (0:00:00.058) 0:00:14.300 ****** 7554 1726853160.33325: entering _queue_task() for managed_node3/yum 7554 1726853160.33326: Creating lock for yum 7554 1726853160.33525: worker is 1 (out of 1 available) 7554 1726853160.33538: exiting _queue_task() for managed_node3/yum 7554 1726853160.33549: done queuing things up, now waiting for results queue to drain 7554 1726853160.33551: waiting for pending results... 
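The line `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` shows plugin-name routing resolving the legacy `yum` action to the `dnf` action plugin before loading. A hypothetical sketch of that kind of redirect resolution (the table and function below are illustrative, not Ansible's actual internals):

```python
# Hypothetical sketch of plugin-name redirection, as suggested by the
# "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" log line.
# REDIRECTS and resolve_action are illustrative names, not Ansible API.
REDIRECTS = {"ansible.builtin.yum": "ansible.builtin.dnf"}

def resolve_action(name):
    # Follow redirects until a terminal plugin name is reached.
    seen = set()
    while name in REDIRECTS:
        if name in seen:  # guard against a redirect cycle
            raise ValueError(f"redirect loop at {name}")
        seen.add(name)
        name = REDIRECTS[name]
    return name

print(resolve_action("ansible.builtin.yum"))  # ansible.builtin.dnf
```

After resolution, the loader finds the `dnf` action plugin already cached (`found_in_cache=True`), which is why no fresh search of the action plugin paths appears for this task.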
7554 1726853160.33725: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7554 1726853160.33809: in run() - task 02083763-bbaf-bdc3-98b6-00000000001c 7554 1726853160.33821: variable 'ansible_search_path' from source: unknown 7554 1726853160.33825: variable 'ansible_search_path' from source: unknown 7554 1726853160.33854: calling self._execute() 7554 1726853160.33924: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.33928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.33936: variable 'omit' from source: magic vars 7554 1726853160.34203: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.34215: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.34335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853160.35797: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853160.35847: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853160.35877: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853160.35901: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853160.35920: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853160.35979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.35998: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.36015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.36040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.36055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.36119: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.36132: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7554 1726853160.36134: when evaluation is False, skipping this task 7554 1726853160.36137: _execute() done 7554 1726853160.36140: dumping result to json 7554 1726853160.36142: done dumping result, returning 7554 1726853160.36151: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-00000000001c] 7554 1726853160.36155: sending task result for task 02083763-bbaf-bdc3-98b6-00000000001c 7554 1726853160.36236: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000001c 7554 1726853160.36239: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7554 1726853160.36322: no more pending results, returning what we have 7554 
1726853160.36324: results queue empty 7554 1726853160.36325: checking for any_errors_fatal 7554 1726853160.36329: done checking for any_errors_fatal 7554 1726853160.36330: checking for max_fail_percentage 7554 1726853160.36332: done checking for max_fail_percentage 7554 1726853160.36332: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.36333: done checking to see if all hosts have failed 7554 1726853160.36334: getting the remaining hosts for this loop 7554 1726853160.36335: done getting the remaining hosts for this loop 7554 1726853160.36338: getting the next task for host managed_node3 7554 1726853160.36343: done getting next task for host managed_node3 7554 1726853160.36346: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7554 1726853160.36349: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853160.36361: getting variables 7554 1726853160.36362: in VariableManager get_vars() 7554 1726853160.36403: Calling all_inventory to load vars for managed_node3 7554 1726853160.36406: Calling groups_inventory to load vars for managed_node3 7554 1726853160.36408: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.36415: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.36418: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.36420: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.39570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.40400: done with get_vars() 7554 1726853160.40415: done getting variables 7554 1726853160.40446: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:26:00 -0400 (0:00:00.071) 0:00:14.372 ****** 7554 1726853160.40467: entering _queue_task() for managed_node3/fail 7554 1726853160.40696: worker is 1 (out of 1 available) 7554 1726853160.40709: exiting _queue_task() for managed_node3/fail 7554 1726853160.40720: done queuing things up, now waiting for results queue to drain 7554 1726853160.40722: waiting for pending results... 
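The YUM-check task above was skipped on the condition `ansible_distribution_major_version | int < 8`. Ansible facts are strings, so the Jinja `int` filter casts before comparing; a minimal sketch (function name hypothetical):

```python
# Equivalent of the Jinja2 gate: ansible_distribution_major_version | int < 8
# The fact arrives as a string (e.g. "9"), so it must be cast before comparison.
# yum_fallback_needed is a hypothetical name for illustration only.
def yum_fallback_needed(ansible_distribution_major_version):
    return int(ansible_distribution_major_version) < 8

print(yum_fallback_needed("7"))   # True: only EL7 and older would use the YUM path
```

On this managed host the major version is 8 or later, so the expression evaluated to `False` and the role fell through to the DNF-based tasks instead.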
7554 1726853160.40967: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7554 1726853160.41011: in run() - task 02083763-bbaf-bdc3-98b6-00000000001d 7554 1726853160.41023: variable 'ansible_search_path' from source: unknown 7554 1726853160.41026: variable 'ansible_search_path' from source: unknown 7554 1726853160.41057: calling self._execute() 7554 1726853160.41126: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.41131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.41140: variable 'omit' from source: magic vars 7554 1726853160.41415: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.41425: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.41508: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853160.41637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853160.43062: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853160.43113: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853160.43143: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853160.43169: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853160.43190: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853160.43247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7554 1726853160.43268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.43287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.43312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.43322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.43356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.43376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.43393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.43417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.43428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7554 1726853160.43458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.43478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.43494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.43517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.43527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.43639: variable 'network_connections' from source: task vars 7554 1726853160.43650: variable 'interface' from source: play vars 7554 1726853160.43702: variable 'interface' from source: play vars 7554 1726853160.43752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853160.43867: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853160.43897: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853160.43920: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853160.43940: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7554 1726853160.43972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853160.43989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853160.44007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.44026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853160.44074: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853160.44222: variable 'network_connections' from source: task vars 7554 1726853160.44233: variable 'interface' from source: play vars 7554 1726853160.44274: variable 'interface' from source: play vars 7554 1726853160.44299: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7554 1726853160.44303: when evaluation is False, skipping this task 7554 1726853160.44306: _execute() done 7554 1726853160.44308: dumping result to json 7554 1726853160.44311: done dumping result, returning 7554 1726853160.44320: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-00000000001d] 7554 1726853160.44323: sending task result for task 02083763-bbaf-bdc3-98b6-00000000001d 7554 1726853160.44415: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000001d 7554 1726853160.44418: WORKER PROCESS EXITING skipping: [managed_node3] => { 
"changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7554 1726853160.44475: no more pending results, returning what we have 7554 1726853160.44478: results queue empty 7554 1726853160.44479: checking for any_errors_fatal 7554 1726853160.44485: done checking for any_errors_fatal 7554 1726853160.44485: checking for max_fail_percentage 7554 1726853160.44487: done checking for max_fail_percentage 7554 1726853160.44488: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.44489: done checking to see if all hosts have failed 7554 1726853160.44489: getting the remaining hosts for this loop 7554 1726853160.44491: done getting the remaining hosts for this loop 7554 1726853160.44494: getting the next task for host managed_node3 7554 1726853160.44500: done getting next task for host managed_node3 7554 1726853160.44503: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7554 1726853160.44506: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853160.44520: getting variables 7554 1726853160.44521: in VariableManager get_vars() 7554 1726853160.44566: Calling all_inventory to load vars for managed_node3 7554 1726853160.44569: Calling groups_inventory to load vars for managed_node3 7554 1726853160.44572: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.44581: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.44583: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.44585: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.45355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.46228: done with get_vars() 7554 1726853160.46243: done getting variables 7554 1726853160.46286: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:26:00 -0400 (0:00:00.058) 0:00:14.430 ****** 7554 1726853160.46311: entering _queue_task() for managed_node3/package 7554 1726853160.46532: worker is 1 (out of 1 available) 7554 1726853160.46548: exiting _queue_task() for managed_node3/package 7554 1726853160.46560: done queuing things up, now waiting for results queue to drain 7554 1726853160.46561: waiting for pending results... 
7554 1726853160.46739: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7554 1726853160.46835: in run() - task 02083763-bbaf-bdc3-98b6-00000000001e 7554 1726853160.46846: variable 'ansible_search_path' from source: unknown 7554 1726853160.46852: variable 'ansible_search_path' from source: unknown 7554 1726853160.46882: calling self._execute() 7554 1726853160.46958: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.46964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.46973: variable 'omit' from source: magic vars 7554 1726853160.47253: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.47264: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.47396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853160.47585: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853160.47616: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853160.47676: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853160.47703: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853160.47786: variable 'network_packages' from source: role '' defaults 7554 1726853160.47856: variable '__network_provider_setup' from source: role '' defaults 7554 1726853160.47864: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853160.47916: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853160.47924: variable '__network_packages_default_nm' from source: role '' defaults 7554 1726853160.47967: variable '__network_packages_default_nm' from source: role 
'' defaults 7554 1726853160.48083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853160.49643: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853160.49686: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853160.49711: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853160.49736: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853160.49759: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853160.49815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.49835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.49859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.49886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.49897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.49928: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.49951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.49966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.49992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.50003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.50139: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7554 1726853160.50214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.50242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.50260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.50290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.50300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.50358: variable 'ansible_python' from source: facts 7554 1726853160.50382: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7554 1726853160.50435: variable '__network_wpa_supplicant_required' from source: role '' defaults 7554 1726853160.50490: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7554 1726853160.50570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.50587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.50607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.50632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.50642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.50675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.50694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.50715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.50737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.50750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.50841: variable 'network_connections' from source: task vars 7554 1726853160.50849: variable 'interface' from source: play vars 7554 1726853160.50915: variable 'interface' from source: play vars 7554 1726853160.50968: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853160.50988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853160.51008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.51035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853160.51066: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853160.51240: variable 'network_connections' from source: task vars 7554 1726853160.51251: variable 'interface' from source: play vars 7554 1726853160.51316: variable 'interface' from source: play vars 7554 1726853160.51353: variable '__network_packages_default_wireless' from source: role '' defaults 7554 1726853160.51410: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853160.51599: variable 'network_connections' from source: task vars 7554 1726853160.51602: variable 'interface' from source: play vars 7554 1726853160.51649: variable 'interface' from source: play vars 7554 1726853160.51666: variable '__network_packages_default_team' from source: role '' defaults 7554 1726853160.51723: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853160.51912: variable 'network_connections' from source: task vars 7554 1726853160.51916: variable 'interface' from source: play vars 7554 1726853160.51961: variable 'interface' from source: play vars 7554 1726853160.52005: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853160.52051: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853160.52054: variable '__network_packages_default_initscripts' from source: role '' defaults 7554 1726853160.52096: variable '__network_packages_default_initscripts' from source: role '' defaults 7554 1726853160.52230: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7554 1726853160.52517: variable 'network_connections' from source: task vars 7554 1726853160.52521: variable 'interface' from source: play vars 7554 1726853160.52569: variable 'interface' from source: play vars 7554 
1726853160.52574: variable 'ansible_distribution' from source: facts 7554 1726853160.52577: variable '__network_rh_distros' from source: role '' defaults 7554 1726853160.52583: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.52599: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7554 1726853160.52705: variable 'ansible_distribution' from source: facts 7554 1726853160.52708: variable '__network_rh_distros' from source: role '' defaults 7554 1726853160.52713: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.52724: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7554 1726853160.52831: variable 'ansible_distribution' from source: facts 7554 1726853160.52834: variable '__network_rh_distros' from source: role '' defaults 7554 1726853160.52839: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.52864: variable 'network_provider' from source: set_fact 7554 1726853160.52877: variable 'ansible_facts' from source: unknown 7554 1726853160.53238: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7554 1726853160.53241: when evaluation is False, skipping this task 7554 1726853160.53243: _execute() done 7554 1726853160.53248: dumping result to json 7554 1726853160.53250: done dumping result, returning 7554 1726853160.53256: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-bdc3-98b6-00000000001e] 7554 1726853160.53261: sending task result for task 02083763-bbaf-bdc3-98b6-00000000001e 7554 1726853160.53350: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000001e 7554 1726853160.53352: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result 
was False" } 7554 1726853160.53401: no more pending results, returning what we have 7554 1726853160.53404: results queue empty 7554 1726853160.53405: checking for any_errors_fatal 7554 1726853160.53411: done checking for any_errors_fatal 7554 1726853160.53412: checking for max_fail_percentage 7554 1726853160.53414: done checking for max_fail_percentage 7554 1726853160.53414: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.53415: done checking to see if all hosts have failed 7554 1726853160.53416: getting the remaining hosts for this loop 7554 1726853160.53417: done getting the remaining hosts for this loop 7554 1726853160.53421: getting the next task for host managed_node3 7554 1726853160.53427: done getting next task for host managed_node3 7554 1726853160.53431: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7554 1726853160.53433: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853160.53449: getting variables 7554 1726853160.53451: in VariableManager get_vars() 7554 1726853160.53501: Calling all_inventory to load vars for managed_node3 7554 1726853160.53503: Calling groups_inventory to load vars for managed_node3 7554 1726853160.53505: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.53514: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.53517: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.53520: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.54436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.55854: done with get_vars() 7554 1726853160.55879: done getting variables 7554 1726853160.55942: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:26:00 -0400 (0:00:00.096) 0:00:14.527 ****** 7554 1726853160.55981: entering _queue_task() for managed_node3/package 7554 1726853160.56252: worker is 1 (out of 1 available) 7554 1726853160.56265: exiting _queue_task() for managed_node3/package 7554 1726853160.56479: done queuing things up, now waiting for results queue to drain 7554 1726853160.56481: waiting for pending results... 
7554 1726853160.56690: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7554 1726853160.56729: in run() - task 02083763-bbaf-bdc3-98b6-00000000001f 7554 1726853160.56755: variable 'ansible_search_path' from source: unknown 7554 1726853160.56765: variable 'ansible_search_path' from source: unknown 7554 1726853160.56806: calling self._execute() 7554 1726853160.56922: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.56940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.57040: variable 'omit' from source: magic vars 7554 1726853160.57388: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.57408: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.57537: variable 'network_state' from source: role '' defaults 7554 1726853160.57557: Evaluated conditional (network_state != {}): False 7554 1726853160.57566: when evaluation is False, skipping this task 7554 1726853160.57579: _execute() done 7554 1726853160.57593: dumping result to json 7554 1726853160.57601: done dumping result, returning 7554 1726853160.57615: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-bdc3-98b6-00000000001f] 7554 1726853160.57626: sending task result for task 02083763-bbaf-bdc3-98b6-00000000001f skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853160.57881: no more pending results, returning what we have 7554 1726853160.57885: results queue empty 7554 1726853160.57886: checking for any_errors_fatal 7554 1726853160.57893: done checking for any_errors_fatal 7554 1726853160.57894: checking for max_fail_percentage 7554 1726853160.57896: done 
checking for max_fail_percentage 7554 1726853160.57896: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.57898: done checking to see if all hosts have failed 7554 1726853160.57898: getting the remaining hosts for this loop 7554 1726853160.57900: done getting the remaining hosts for this loop 7554 1726853160.57903: getting the next task for host managed_node3 7554 1726853160.57910: done getting next task for host managed_node3 7554 1726853160.57913: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7554 1726853160.57917: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853160.57934: getting variables 7554 1726853160.57936: in VariableManager get_vars() 7554 1726853160.57993: Calling all_inventory to load vars for managed_node3 7554 1726853160.57997: Calling groups_inventory to load vars for managed_node3 7554 1726853160.57999: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.58012: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.58015: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.58019: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.58593: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000001f 7554 1726853160.58596: WORKER PROCESS EXITING 7554 1726853160.59536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.60418: done with get_vars() 7554 1726853160.60433: done getting variables 7554 1726853160.60479: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:26:00 -0400 (0:00:00.045) 0:00:14.572 ****** 7554 1726853160.60503: entering _queue_task() for managed_node3/package 7554 1726853160.60731: worker is 1 (out of 1 available) 7554 1726853160.60747: exiting _queue_task() for managed_node3/package 7554 1726853160.60758: done queuing things up, now waiting for results queue to drain 7554 1726853160.60759: waiting for pending results... 
7554 1726853160.60932: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7554 1726853160.61027: in run() - task 02083763-bbaf-bdc3-98b6-000000000020 7554 1726853160.61040: variable 'ansible_search_path' from source: unknown 7554 1726853160.61046: variable 'ansible_search_path' from source: unknown 7554 1726853160.61073: calling self._execute() 7554 1726853160.61148: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.61152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.61159: variable 'omit' from source: magic vars 7554 1726853160.61427: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.61441: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.61524: variable 'network_state' from source: role '' defaults 7554 1726853160.61537: Evaluated conditional (network_state != {}): False 7554 1726853160.61540: when evaluation is False, skipping this task 7554 1726853160.61543: _execute() done 7554 1726853160.61545: dumping result to json 7554 1726853160.61548: done dumping result, returning 7554 1726853160.61557: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-bdc3-98b6-000000000020] 7554 1726853160.61561: sending task result for task 02083763-bbaf-bdc3-98b6-000000000020 7554 1726853160.61648: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000020 7554 1726853160.61651: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853160.61696: no more pending results, returning what we have 7554 1726853160.61699: results queue empty 7554 1726853160.61700: checking for any_errors_fatal 7554 
1726853160.61708: done checking for any_errors_fatal 7554 1726853160.61708: checking for max_fail_percentage 7554 1726853160.61710: done checking for max_fail_percentage 7554 1726853160.61710: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.61711: done checking to see if all hosts have failed 7554 1726853160.61712: getting the remaining hosts for this loop 7554 1726853160.61713: done getting the remaining hosts for this loop 7554 1726853160.61717: getting the next task for host managed_node3 7554 1726853160.61722: done getting next task for host managed_node3 7554 1726853160.61726: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7554 1726853160.61729: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853160.61744: getting variables 7554 1726853160.61746: in VariableManager get_vars() 7554 1726853160.61787: Calling all_inventory to load vars for managed_node3 7554 1726853160.61789: Calling groups_inventory to load vars for managed_node3 7554 1726853160.61791: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.61799: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.61802: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.61804: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.62626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.63462: done with get_vars() 7554 1726853160.63478: done getting variables 7554 1726853160.63548: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:26:00 -0400 (0:00:00.030) 0:00:14.603 ****** 7554 1726853160.63569: entering _queue_task() for managed_node3/service 7554 1726853160.63570: Creating lock for service 7554 1726853160.63788: worker is 1 (out of 1 available) 7554 1726853160.63803: exiting _queue_task() for managed_node3/service 7554 1726853160.63815: done queuing things up, now waiting for results queue to drain 7554 1726853160.63817: waiting for pending results... 
7554 1726853160.63991: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7554 1726853160.64076: in run() - task 02083763-bbaf-bdc3-98b6-000000000021 7554 1726853160.64088: variable 'ansible_search_path' from source: unknown 7554 1726853160.64091: variable 'ansible_search_path' from source: unknown 7554 1726853160.64118: calling self._execute() 7554 1726853160.64192: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.64196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.64205: variable 'omit' from source: magic vars 7554 1726853160.64475: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.64489: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.64567: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853160.64703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853160.66176: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853160.66227: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853160.66255: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853160.66282: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853160.66302: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853160.66362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 
1726853160.66383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.66400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.66425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.66441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.66475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.66491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.66507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.66531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.66547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7554 1726853160.66575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.66590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.66606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.66630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.66640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.66765: variable 'network_connections' from source: task vars 7554 1726853160.66769: variable 'interface' from source: play vars 7554 1726853160.66821: variable 'interface' from source: play vars 7554 1726853160.66875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853160.66984: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853160.67019: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853160.67042: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853160.67065: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853160.67100: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853160.67116: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853160.67133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.67153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853160.67203: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853160.67355: variable 'network_connections' from source: task vars 7554 1726853160.67358: variable 'interface' from source: play vars 7554 1726853160.67402: variable 'interface' from source: play vars 7554 1726853160.67432: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7554 1726853160.67436: when evaluation is False, skipping this task 7554 1726853160.67438: _execute() done 7554 1726853160.67441: dumping result to json 7554 1726853160.67443: done dumping result, returning 7554 1726853160.67450: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-000000000021] 7554 1726853160.67456: sending task result for task 02083763-bbaf-bdc3-98b6-000000000021 7554 1726853160.67542: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000021 7554 1726853160.67551: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7554 1726853160.67598: no more pending results, returning what we have 7554 1726853160.67602: results queue empty 7554 1726853160.67602: checking for any_errors_fatal 7554 1726853160.67609: done checking for any_errors_fatal 7554 1726853160.67610: checking for max_fail_percentage 7554 1726853160.67611: done checking for max_fail_percentage 7554 1726853160.67612: checking to see if all hosts have failed and the running result is not ok 7554 1726853160.67613: done checking to see if all hosts have failed 7554 1726853160.67614: getting the remaining hosts for this loop 7554 1726853160.67615: done getting the remaining hosts for this loop 7554 1726853160.67618: getting the next task for host managed_node3 7554 1726853160.67624: done getting next task for host managed_node3 7554 1726853160.67628: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7554 1726853160.67631: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853160.67644: getting variables 7554 1726853160.67646: in VariableManager get_vars() 7554 1726853160.67697: Calling all_inventory to load vars for managed_node3 7554 1726853160.67700: Calling groups_inventory to load vars for managed_node3 7554 1726853160.67702: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853160.67710: Calling all_plugins_play to load vars for managed_node3 7554 1726853160.67712: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853160.67715: Calling groups_plugins_play to load vars for managed_node3 7554 1726853160.68486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853160.69432: done with get_vars() 7554 1726853160.69447: done getting variables 7554 1726853160.69490: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:26:00 -0400 (0:00:00.059) 0:00:14.662 ****** 7554 1726853160.69514: entering _queue_task() for managed_node3/service 7554 1726853160.69743: worker is 1 (out of 1 available) 7554 1726853160.69759: exiting _queue_task() for managed_node3/service 7554 1726853160.69773: done queuing things up, now waiting for results queue to drain 7554 1726853160.69775: waiting for pending results... 
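The skip recorded above ("Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False … skipping this task") is the normal behavior of a task guarded by a `when:` clause. A minimal sketch of such a guarded restart task follows — the module and parameters here are illustrative, not the role's actual source:

```yaml
# Hypothetical sketch of a conditionally skipped task, modeled on the
# "Restart NetworkManager due to wireless or team interfaces" skip above.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

When both variables evaluate false, the executor short-circuits before the handler runs and reports `skip_reason: "Conditional result was False"`, exactly as in the JSON result above.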
7554 1726853160.69951: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7554 1726853160.70050: in run() - task 02083763-bbaf-bdc3-98b6-000000000022 7554 1726853160.70064: variable 'ansible_search_path' from source: unknown 7554 1726853160.70067: variable 'ansible_search_path' from source: unknown 7554 1726853160.70103: calling self._execute() 7554 1726853160.70180: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.70185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.70194: variable 'omit' from source: magic vars 7554 1726853160.70469: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.70481: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853160.70589: variable 'network_provider' from source: set_fact 7554 1726853160.70593: variable 'network_state' from source: role '' defaults 7554 1726853160.70602: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7554 1726853160.70608: variable 'omit' from source: magic vars 7554 1726853160.70652: variable 'omit' from source: magic vars 7554 1726853160.70675: variable 'network_service_name' from source: role '' defaults 7554 1726853160.70726: variable 'network_service_name' from source: role '' defaults 7554 1726853160.70799: variable '__network_provider_setup' from source: role '' defaults 7554 1726853160.70804: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853160.70851: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853160.70857: variable '__network_packages_default_nm' from source: role '' defaults 7554 1726853160.70904: variable '__network_packages_default_nm' from source: role '' defaults 7554 1726853160.71051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 
1726853160.72494: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853160.72549: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853160.72576: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853160.72600: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853160.72624: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853160.72681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.72702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.72719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.72748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.72764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.72876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
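Among the filter plugins loaded above is `urlsplit`. Assuming it mirrors Python's standard `urllib.parse.urlsplit` (a plausible reading, since these plugins live in the same Python 3.12 site-packages tree the log shows), the kind of decomposition it performs can be sketched directly in Python:

```python
# Illustration of URL decomposition as performed by the standard library;
# the Ansible urlsplit filter is assumed to expose similar components.
from urllib.parse import urlsplit

parts = urlsplit("https://user@example.com:8080/path?q=1#frag")
info = {
    "scheme": parts.scheme,      # 'https'
    "hostname": parts.hostname,  # 'example.com'
    "port": parts.port,          # 8080
    "path": parts.path,          # '/path'
    "query": parts.query,        # 'q=1'
    "fragment": parts.fragment,  # 'frag'
}
print(info)
```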
7554 1726853160.72879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.72881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.72892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.72904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.73180: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7554 1726853160.73209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.73282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.73480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.73483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.73486: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.73488: variable 'ansible_python' from source: facts 7554 1726853160.73491: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7554 1726853160.73493: variable '__network_wpa_supplicant_required' from source: role '' defaults 7554 1726853160.73555: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7554 1726853160.73666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.73681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.73698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.73726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.73736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.73776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853160.73801: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853160.73814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.73849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853160.73858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853160.73945: variable 'network_connections' from source: task vars 7554 1726853160.73967: variable 'interface' from source: play vars 7554 1726853160.74018: variable 'interface' from source: play vars 7554 1726853160.74106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853160.74241: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853160.74291: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853160.74319: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853160.74350: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853160.74397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853160.74427: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853160.74481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853160.74484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853160.74534: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853160.74737: variable 'network_connections' from source: task vars 7554 1726853160.74742: variable 'interface' from source: play vars 7554 1726853160.74829: variable 'interface' from source: play vars 7554 1726853160.74865: variable '__network_packages_default_wireless' from source: role '' defaults 7554 1726853160.74915: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853160.75158: variable 'network_connections' from source: task vars 7554 1726853160.75161: variable 'interface' from source: play vars 7554 1726853160.75218: variable 'interface' from source: play vars 7554 1726853160.75231: variable '__network_packages_default_team' from source: role '' defaults 7554 1726853160.75335: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853160.75533: variable 'network_connections' from source: task vars 7554 1726853160.75536: variable 'interface' from source: play vars 7554 1726853160.75589: variable 'interface' from source: play vars 7554 1726853160.75647: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853160.75758: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853160.75763: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7554 1726853160.75766: variable '__network_packages_default_initscripts' from source: role '' defaults 7554 1726853160.75909: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7554 1726853160.76235: variable 'network_connections' from source: task vars 7554 1726853160.76239: variable 'interface' from source: play vars 7554 1726853160.76283: variable 'interface' from source: play vars 7554 1726853160.76292: variable 'ansible_distribution' from source: facts 7554 1726853160.76294: variable '__network_rh_distros' from source: role '' defaults 7554 1726853160.76300: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.76319: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7554 1726853160.76430: variable 'ansible_distribution' from source: facts 7554 1726853160.76434: variable '__network_rh_distros' from source: role '' defaults 7554 1726853160.76436: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.76476: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7554 1726853160.76560: variable 'ansible_distribution' from source: facts 7554 1726853160.76563: variable '__network_rh_distros' from source: role '' defaults 7554 1726853160.76568: variable 'ansible_distribution_major_version' from source: facts 7554 1726853160.76595: variable 'network_provider' from source: set_fact 7554 1726853160.76612: variable 'omit' from source: magic vars 7554 1726853160.76635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853160.76659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853160.76676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853160.76689: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853160.76698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853160.76720: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853160.76723: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.76725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.76796: Set connection var ansible_shell_executable to /bin/sh 7554 1726853160.76803: Set connection var ansible_pipelining to False 7554 1726853160.76805: Set connection var ansible_shell_type to sh 7554 1726853160.76808: Set connection var ansible_connection to ssh 7554 1726853160.76815: Set connection var ansible_timeout to 10 7554 1726853160.76820: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853160.76838: variable 'ansible_shell_executable' from source: unknown 7554 1726853160.76841: variable 'ansible_connection' from source: unknown 7554 1726853160.76843: variable 'ansible_module_compression' from source: unknown 7554 1726853160.76848: variable 'ansible_shell_type' from source: unknown 7554 1726853160.76850: variable 'ansible_shell_executable' from source: unknown 7554 1726853160.76852: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853160.76855: variable 'ansible_pipelining' from source: unknown 7554 1726853160.76857: variable 'ansible_timeout' from source: unknown 7554 1726853160.76860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853160.76943: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853160.76952: variable 'omit' from source: magic vars 7554 1726853160.76960: starting attempt loop 7554 1726853160.76962: running the handler 7554 1726853160.77019: variable 'ansible_facts' from source: unknown 7554 1726853160.77474: _low_level_execute_command(): starting 7554 1726853160.77481: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853160.77976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853160.77981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853160.77984: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853160.77986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853160.78036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853160.78039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853160.78041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 7554 1726853160.78116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853160.79852: stdout chunk (state=3): >>>/root <<< 7554 1726853160.79949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853160.79983: stderr chunk (state=3): >>><<< 7554 1726853160.79986: stdout chunk (state=3): >>><<< 7554 1726853160.80004: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853160.80014: _low_level_execute_command(): starting 7554 1726853160.80021: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287 `" && echo ansible-tmp-1726853160.800045-8141-158915100031287="` echo 
/root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287 `" ) && sleep 0' 7554 1726853160.80485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853160.80488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853160.80491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853160.80494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853160.80496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853160.80541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853160.80544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853160.80549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853160.80610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853160.82566: stdout chunk (state=3): >>>ansible-tmp-1726853160.800045-8141-158915100031287=/root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287 <<< 7554 1726853160.82674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853160.82703: 
stderr chunk (state=3): >>><<< 7554 1726853160.82706: stdout chunk (state=3): >>><<< 7554 1726853160.82719: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853160.800045-8141-158915100031287=/root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853160.82753: variable 'ansible_module_compression' from source: unknown 7554 1726853160.82798: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 7554 1726853160.82802: ANSIBALLZ: Acquiring lock 7554 1726853160.82805: ANSIBALLZ: Lock acquired: 140257826526304 7554 1726853160.82807: ANSIBALLZ: Creating module 7554 1726853161.02479: ANSIBALLZ: Writing module into payload 7554 1726853161.02581: ANSIBALLZ: Writing module 7554 1726853161.02612: ANSIBALLZ: Renaming module 7554 1726853161.02626: ANSIBALLZ: Done creating module 
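The ANSIBALLZ lines above (Creating module → Writing module → Renaming module) cover the step where Ansible packages the `systemd` module and its imports into a single self-contained `AnsiballZ_systemd.py` script for transfer. A conceptual sketch of that packaging idea — an in-memory zip, deflate-compressed to match the `ZIP_DEFLATED` setting in the log, then base64-embedded in a bootstrap — is shown below; this is an illustration of the mechanism, not Ansible's actual implementation:

```python
# Conceptual sketch of AnsiballZ-style packaging: zip the module source in
# memory with DEFLATE (the ansible_module_compression seen in the log), then
# base64-encode the archive so it can be embedded in a bootstrap .py file.
import base64
import io
import zipfile

module_source = b"print('hello from the module')\n"

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("ansible/modules/demo.py", module_source)

payload = base64.b64encode(buf.getvalue()).decode("ascii")
# A bootstrap script would decode this payload, unpack it to a temp dir
# on the target, and execute the module from there.
print(len(payload) > 0)
```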
7554 1726853161.02669: variable 'ansible_facts' from source: unknown 7554 1726853161.02880: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287/AnsiballZ_systemd.py 7554 1726853161.03122: Sending initial data 7554 1726853161.03125: Sent initial data (153 bytes) 7554 1726853161.03631: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853161.03646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853161.03752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853161.03777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853161.03887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853161.05554: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853161.05611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853161.05669: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpfmjs4ot3 /root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287/AnsiballZ_systemd.py <<< 7554 1726853161.05674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287/AnsiballZ_systemd.py" <<< 7554 1726853161.05727: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpfmjs4ot3" to remote "/root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287/AnsiballZ_systemd.py" <<< 7554 1726853161.05730: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287/AnsiballZ_systemd.py" <<< 7554 1726853161.07104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853161.07108: stdout chunk (state=3): >>><<< 7554 1726853161.07110: stderr chunk (state=3): >>><<< 7554 1726853161.07112: done transferring module to remote 7554 1726853161.07114: _low_level_execute_command(): starting 7554 1726853161.07117: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287/ 
/root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287/AnsiballZ_systemd.py && sleep 0' 7554 1726853161.07610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853161.07654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853161.07691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853161.07707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853161.07785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853161.09978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853161.09982: stdout chunk (state=3): >>><<< 7554 1726853161.09985: stderr chunk (state=3): >>><<< 7554 1726853161.09989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853161.09992: _low_level_execute_command(): starting 7554 1726853161.09995: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287/AnsiballZ_systemd.py && sleep 0' 7554 1726853161.10383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853161.10396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853161.10411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853161.10430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853161.10526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853161.10553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853161.10656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853161.40820: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9363456", "MemoryPeak": "9883648", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3344785408", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "92585000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", 
"CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", 
"IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service multi-user.target cloud-init.service network.target", "After": "cloud-init-local.service network-pre.target dbus.socket system.slice sysinit.target dbus-broker.service basic.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:21:20 EDT", "StateChangeTimestampMonotonic": "24855925", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", 
"InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7554 1726853161.43219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853161.43223: stdout chunk (state=3): >>><<< 7554 1726853161.43226: stderr chunk (state=3): >>><<< 7554 1726853161.43382: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9363456", "MemoryPeak": "9883648", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3344785408", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "92585000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service multi-user.target cloud-init.service network.target", "After": "cloud-init-local.service network-pre.target dbus.socket system.slice sysinit.target dbus-broker.service basic.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:21:20 EDT", "StateChangeTimestampMonotonic": "24855925", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": 
"Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853161.43489: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853161.43516: _low_level_execute_command(): starting 7554 1726853161.43527: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853160.800045-8141-158915100031287/ > /dev/null 2>&1 && sleep 0' 7554 1726853161.44014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853161.44027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853161.44039: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853161.44083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853161.44099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853161.44165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853161.46188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853161.46191: stdout chunk (state=3): >>><<< 7554 1726853161.46194: stderr chunk (state=3): >>><<< 7554 1726853161.46580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 
1726853161.46584: handler run complete 7554 1726853161.46587: attempt loop complete, returning result 7554 1726853161.46589: _execute() done 7554 1726853161.46591: dumping result to json 7554 1726853161.46593: done dumping result, returning 7554 1726853161.46595: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-bdc3-98b6-000000000022] 7554 1726853161.46597: sending task result for task 02083763-bbaf-bdc3-98b6-000000000022 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853161.47349: no more pending results, returning what we have 7554 1726853161.47352: results queue empty 7554 1726853161.47353: checking for any_errors_fatal 7554 1726853161.47361: done checking for any_errors_fatal 7554 1726853161.47362: checking for max_fail_percentage 7554 1726853161.47363: done checking for max_fail_percentage 7554 1726853161.47364: checking to see if all hosts have failed and the running result is not ok 7554 1726853161.47365: done checking to see if all hosts have failed 7554 1726853161.47366: getting the remaining hosts for this loop 7554 1726853161.47367: done getting the remaining hosts for this loop 7554 1726853161.47375: getting the next task for host managed_node3 7554 1726853161.47381: done getting next task for host managed_node3 7554 1726853161.47384: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7554 1726853161.47387: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853161.47398: getting variables 7554 1726853161.47400: in VariableManager get_vars() 7554 1726853161.47448: Calling all_inventory to load vars for managed_node3 7554 1726853161.47451: Calling groups_inventory to load vars for managed_node3 7554 1726853161.47454: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853161.47463: Calling all_plugins_play to load vars for managed_node3 7554 1726853161.47466: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853161.47469: Calling groups_plugins_play to load vars for managed_node3 7554 1726853161.48287: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000022 7554 1726853161.48291: WORKER PROCESS EXITING 7554 1726853161.49169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853161.50599: done with get_vars() 7554 1726853161.50634: done getting variables 7554 1726853161.50702: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:26:01 -0400 (0:00:00.812) 0:00:15.474 ****** 7554 1726853161.50736: entering _queue_task() for managed_node3/service 7554 1726853161.51083: worker is 1 (out of 1 available) 7554 1726853161.51098: exiting _queue_task() for managed_node3/service 7554 1726853161.51110: done queuing things up, now waiting for results queue to drain 7554 1726853161.51112: 
waiting for pending results... 7554 1726853161.51410: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7554 1726853161.51577: in run() - task 02083763-bbaf-bdc3-98b6-000000000023 7554 1726853161.51599: variable 'ansible_search_path' from source: unknown 7554 1726853161.51607: variable 'ansible_search_path' from source: unknown 7554 1726853161.51657: calling self._execute() 7554 1726853161.51774: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853161.51787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853161.51803: variable 'omit' from source: magic vars 7554 1726853161.52213: variable 'ansible_distribution_major_version' from source: facts 7554 1726853161.52236: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853161.52375: variable 'network_provider' from source: set_fact 7554 1726853161.52388: Evaluated conditional (network_provider == "nm"): True 7554 1726853161.52486: variable '__network_wpa_supplicant_required' from source: role '' defaults 7554 1726853161.52584: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7554 1726853161.52766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853161.55164: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853161.55241: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853161.55288: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853161.55330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853161.55366: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853161.55458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853161.55496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853161.55540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853161.55648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853161.55652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853161.55655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853161.55687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853161.55716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853161.55762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853161.55790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853161.55836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853161.55878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853161.55902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853161.55983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853161.55986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853161.56276: variable 'network_connections' from source: task vars 7554 1726853161.56279: variable 'interface' from source: play vars 7554 1726853161.56282: variable 'interface' from source: play vars 7554 1726853161.56315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853161.56510: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853161.56555: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853161.56594: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853161.56630: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853161.56684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853161.56711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853161.56749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853161.56786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853161.56847: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853161.57109: variable 'network_connections' from source: task vars 7554 1726853161.57119: variable 'interface' from source: play vars 7554 1726853161.57188: variable 'interface' from source: play vars 7554 1726853161.57232: Evaluated conditional (__network_wpa_supplicant_required): False 7554 1726853161.57239: when evaluation is False, skipping this task 7554 1726853161.57248: _execute() done 7554 1726853161.57254: dumping result to json 7554 1726853161.57266: done dumping result, returning 7554 1726853161.57378: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-bdc3-98b6-000000000023] 7554 1726853161.57388: 
sending task result for task 02083763-bbaf-bdc3-98b6-000000000023 7554 1726853161.57455: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000023 7554 1726853161.57458: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7554 1726853161.57523: no more pending results, returning what we have 7554 1726853161.57526: results queue empty 7554 1726853161.57527: checking for any_errors_fatal 7554 1726853161.57550: done checking for any_errors_fatal 7554 1726853161.57550: checking for max_fail_percentage 7554 1726853161.57552: done checking for max_fail_percentage 7554 1726853161.57553: checking to see if all hosts have failed and the running result is not ok 7554 1726853161.57554: done checking to see if all hosts have failed 7554 1726853161.57555: getting the remaining hosts for this loop 7554 1726853161.57556: done getting the remaining hosts for this loop 7554 1726853161.57560: getting the next task for host managed_node3 7554 1726853161.57566: done getting next task for host managed_node3 7554 1726853161.57572: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7554 1726853161.57575: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853161.57589: getting variables 7554 1726853161.57590: in VariableManager get_vars() 7554 1726853161.57640: Calling all_inventory to load vars for managed_node3 7554 1726853161.57643: Calling groups_inventory to load vars for managed_node3 7554 1726853161.57647: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853161.57657: Calling all_plugins_play to load vars for managed_node3 7554 1726853161.57660: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853161.57663: Calling groups_plugins_play to load vars for managed_node3 7554 1726853161.59381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853161.62336: done with get_vars() 7554 1726853161.62369: done getting variables 7554 1726853161.62555: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:26:01 -0400 (0:00:00.118) 0:00:15.593 ****** 7554 1726853161.62591: entering _queue_task() for managed_node3/service 7554 1726853161.62929: worker is 1 (out of 1 available) 7554 1726853161.62942: exiting _queue_task() for managed_node3/service 7554 1726853161.62953: done queuing things up, now waiting for results queue to drain 7554 1726853161.62955: waiting for pending results... 
7554 1726853161.63243: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7554 1726853161.63479: in run() - task 02083763-bbaf-bdc3-98b6-000000000024 7554 1726853161.63483: variable 'ansible_search_path' from source: unknown 7554 1726853161.63486: variable 'ansible_search_path' from source: unknown 7554 1726853161.63489: calling self._execute() 7554 1726853161.63564: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853161.63578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853161.63596: variable 'omit' from source: magic vars 7554 1726853161.63999: variable 'ansible_distribution_major_version' from source: facts 7554 1726853161.64019: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853161.64149: variable 'network_provider' from source: set_fact 7554 1726853161.64167: Evaluated conditional (network_provider == "initscripts"): False 7554 1726853161.64179: when evaluation is False, skipping this task 7554 1726853161.64189: _execute() done 7554 1726853161.64197: dumping result to json 7554 1726853161.64316: done dumping result, returning 7554 1726853161.64321: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-bdc3-98b6-000000000024] 7554 1726853161.64323: sending task result for task 02083763-bbaf-bdc3-98b6-000000000024 7554 1726853161.64401: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000024 7554 1726853161.64404: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853161.64455: no more pending results, returning what we have 7554 1726853161.64459: results queue empty 7554 1726853161.64460: checking for any_errors_fatal 7554 1726853161.64470: done checking for any_errors_fatal 7554 
1726853161.64573: checking for max_fail_percentage 7554 1726853161.64576: done checking for max_fail_percentage 7554 1726853161.64577: checking to see if all hosts have failed and the running result is not ok 7554 1726853161.64579: done checking to see if all hosts have failed 7554 1726853161.64579: getting the remaining hosts for this loop 7554 1726853161.64581: done getting the remaining hosts for this loop 7554 1726853161.64585: getting the next task for host managed_node3 7554 1726853161.64592: done getting next task for host managed_node3 7554 1726853161.64596: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7554 1726853161.64600: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853161.64618: getting variables 7554 1726853161.64620: in VariableManager get_vars() 7554 1726853161.64860: Calling all_inventory to load vars for managed_node3 7554 1726853161.64864: Calling groups_inventory to load vars for managed_node3 7554 1726853161.64866: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853161.64879: Calling all_plugins_play to load vars for managed_node3 7554 1726853161.64882: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853161.64885: Calling groups_plugins_play to load vars for managed_node3 7554 1726853161.67561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853161.69513: done with get_vars() 7554 1726853161.69535: done getting variables 7554 1726853161.69918: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:26:01 -0400 (0:00:00.073) 0:00:15.667 ****** 7554 1726853161.69958: entering _queue_task() for managed_node3/copy 7554 1726853161.70715: worker is 1 (out of 1 available) 7554 1726853161.70727: exiting _queue_task() for managed_node3/copy 7554 1726853161.70739: done queuing things up, now waiting for results queue to drain 7554 1726853161.70741: waiting for pending results... 
7554 1726853161.71393: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7554 1726853161.71663: in run() - task 02083763-bbaf-bdc3-98b6-000000000025 7554 1726853161.71678: variable 'ansible_search_path' from source: unknown 7554 1726853161.71682: variable 'ansible_search_path' from source: unknown 7554 1726853161.71776: calling self._execute() 7554 1726853161.72038: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853161.72047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853161.72054: variable 'omit' from source: magic vars 7554 1726853161.72908: variable 'ansible_distribution_major_version' from source: facts 7554 1726853161.72927: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853161.73068: variable 'network_provider' from source: set_fact 7554 1726853161.73083: Evaluated conditional (network_provider == "initscripts"): False 7554 1726853161.73091: when evaluation is False, skipping this task 7554 1726853161.73098: _execute() done 7554 1726853161.73105: dumping result to json 7554 1726853161.73276: done dumping result, returning 7554 1726853161.73280: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-bdc3-98b6-000000000025] 7554 1726853161.73283: sending task result for task 02083763-bbaf-bdc3-98b6-000000000025 7554 1726853161.73359: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000025 7554 1726853161.73362: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7554 1726853161.73414: no more pending results, returning what we have 7554 1726853161.73418: results queue empty 7554 1726853161.73419: checking for any_errors_fatal 7554 
1726853161.73425: done checking for any_errors_fatal 7554 1726853161.73426: checking for max_fail_percentage 7554 1726853161.73428: done checking for max_fail_percentage 7554 1726853161.73429: checking to see if all hosts have failed and the running result is not ok 7554 1726853161.73430: done checking to see if all hosts have failed 7554 1726853161.73431: getting the remaining hosts for this loop 7554 1726853161.73432: done getting the remaining hosts for this loop 7554 1726853161.73436: getting the next task for host managed_node3 7554 1726853161.73442: done getting next task for host managed_node3 7554 1726853161.73449: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7554 1726853161.73453: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853161.73468: getting variables 7554 1726853161.73469: in VariableManager get_vars() 7554 1726853161.73526: Calling all_inventory to load vars for managed_node3 7554 1726853161.73529: Calling groups_inventory to load vars for managed_node3 7554 1726853161.73532: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853161.73548: Calling all_plugins_play to load vars for managed_node3 7554 1726853161.73552: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853161.73555: Calling groups_plugins_play to load vars for managed_node3 7554 1726853161.75539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853161.78351: done with get_vars() 7554 1726853161.78429: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:26:01 -0400 (0:00:00.085) 0:00:15.752 ****** 7554 1726853161.78521: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7554 1726853161.78523: Creating lock for fedora.linux_system_roles.network_connections 7554 1726853161.78879: worker is 1 (out of 1 available) 7554 1726853161.78893: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7554 1726853161.78905: done queuing things up, now waiting for results queue to drain 7554 1726853161.78907: waiting for pending results... 
7554 1726853161.79293: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7554 1726853161.79354: in run() - task 02083763-bbaf-bdc3-98b6-000000000026 7554 1726853161.79377: variable 'ansible_search_path' from source: unknown 7554 1726853161.79390: variable 'ansible_search_path' from source: unknown 7554 1726853161.79433: calling self._execute() 7554 1726853161.79696: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853161.79710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853161.79826: variable 'omit' from source: magic vars 7554 1726853161.80529: variable 'ansible_distribution_major_version' from source: facts 7554 1726853161.80602: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853161.80615: variable 'omit' from source: magic vars 7554 1726853161.80751: variable 'omit' from source: magic vars 7554 1726853161.80965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853161.83248: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853161.83322: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853161.83367: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853161.83408: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853161.83450: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853161.83543: variable 'network_provider' from source: set_fact 7554 1726853161.83694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853161.83741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853161.83777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853161.83819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853161.83858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853161.83921: variable 'omit' from source: magic vars 7554 1726853161.84053: variable 'omit' from source: magic vars 7554 1726853161.84194: variable 'network_connections' from source: task vars 7554 1726853161.84206: variable 'interface' from source: play vars 7554 1726853161.84295: variable 'interface' from source: play vars 7554 1726853161.84541: variable 'omit' from source: magic vars 7554 1726853161.84556: variable '__lsr_ansible_managed' from source: task vars 7554 1726853161.84617: variable '__lsr_ansible_managed' from source: task vars 7554 1726853161.85251: Loaded config def from plugin (lookup/template) 7554 1726853161.85263: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7554 1726853161.85375: File lookup term: get_ansible_managed.j2 7554 1726853161.85378: variable 'ansible_search_path' from source: unknown 7554 1726853161.85381: evaluation_path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7554 1726853161.85385: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7554 1726853161.85388: variable 'ansible_search_path' from source: unknown 7554 1726853161.91321: variable 'ansible_managed' from source: unknown 7554 1726853161.91478: variable 'omit' from source: magic vars 7554 1726853161.91676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853161.91680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853161.91682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853161.91684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853161.91686: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853161.91688: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853161.91690: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853161.91692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853161.91730: Set connection var ansible_shell_executable to /bin/sh 7554 1726853161.91743: Set connection var ansible_pipelining to False 7554 1726853161.91752: Set connection var ansible_shell_type to sh 7554 1726853161.91758: Set connection var ansible_connection to ssh 7554 1726853161.91774: Set connection var ansible_timeout to 10 7554 1726853161.91784: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853161.91812: variable 'ansible_shell_executable' from source: unknown 7554 1726853161.91819: variable 'ansible_connection' from source: unknown 7554 1726853161.91826: variable 'ansible_module_compression' from source: unknown 7554 1726853161.91832: variable 'ansible_shell_type' from source: unknown 7554 1726853161.91838: variable 'ansible_shell_executable' from source: unknown 7554 1726853161.91844: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853161.91855: variable 'ansible_pipelining' from source: unknown 7554 1726853161.91861: variable 'ansible_timeout' from source: unknown 7554 1726853161.91868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853161.92001: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853161.92030: variable 'omit' from source: magic vars 7554 1726853161.92040: starting attempt loop 7554 1726853161.92050: running the handler 7554 
1726853161.92066: _low_level_execute_command(): starting 7554 1726853161.92080: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853161.92799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853161.92889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853161.92904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853161.92919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853161.93018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853161.94744: stdout chunk (state=3): >>>/root <<< 7554 1726853161.94888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853161.94925: stdout chunk (state=3): >>><<< 7554 1726853161.94928: stderr chunk (state=3): >>><<< 7554 1726853161.94950: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853161.94969: _low_level_execute_command(): starting 7554 1726853161.95058: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454 `" && echo ansible-tmp-1726853161.94957-8181-251406519028454="` echo /root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454 `" ) && sleep 0' 7554 1726853161.95588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853161.95605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853161.95622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853161.95686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853161.95736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853161.95752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853161.95776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853161.95869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853161.97877: stdout chunk (state=3): >>>ansible-tmp-1726853161.94957-8181-251406519028454=/root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454 <<< 7554 1726853161.98015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853161.98019: stderr chunk (state=3): >>><<< 7554 1726853161.98022: stdout chunk (state=3): >>><<< 7554 1726853161.98078: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853161.94957-8181-251406519028454=/root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853161.98104: variable 'ansible_module_compression' from source: unknown 7554 1726853161.98189: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 7554 1726853161.98192: ANSIBALLZ: Acquiring lock 7554 1726853161.98195: ANSIBALLZ: Lock acquired: 140257824543312 7554 1726853161.98197: ANSIBALLZ: Creating module 7554 1726853162.19905: ANSIBALLZ: Writing module into payload 7554 1726853162.20430: ANSIBALLZ: Writing module 7554 1726853162.20455: ANSIBALLZ: Renaming module 7554 1726853162.20509: ANSIBALLZ: Done creating module 7554 1726853162.20549: variable 'ansible_facts' from source: unknown 7554 1726853162.20711: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454/AnsiballZ_network_connections.py 7554 1726853162.20942: Sending initial data 7554 1726853162.20946: Sent initial data (164 bytes) 7554 1726853162.22076: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853162.22139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853162.22145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853162.22204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853162.22309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853162.23977: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7554 1726853162.23994: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7554 1726853162.24006: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 7554 1726853162.24017: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 7554 1726853162.24032: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 7554 1726853162.24050: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 7554 1726853162.24066: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853162.24160: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853162.24249: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpcir4m7t3 /root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454/AnsiballZ_network_connections.py <<< 7554 1726853162.24262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454/AnsiballZ_network_connections.py" <<< 7554 1726853162.24342: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpcir4m7t3" to remote "/root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454/AnsiballZ_network_connections.py" <<< 7554 1726853162.25955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853162.26128: stderr chunk (state=3): >>><<< 7554 1726853162.26131: stdout chunk (state=3): >>><<< 7554 1726853162.26133: done transferring module to remote 7554 1726853162.26135: _low_level_execute_command(): starting 7554 1726853162.26137: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454/ /root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454/AnsiballZ_network_connections.py && sleep 0' 7554 1726853162.26786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853162.26834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853162.26859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853162.26886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853162.26969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853162.28901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853162.28905: stdout chunk (state=3): >>><<< 7554 1726853162.28907: stderr chunk (state=3): >>><<< 7554 1726853162.28931: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853162.28940: _low_level_execute_command(): starting 7554 1726853162.28953: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454/AnsiballZ_network_connections.py && sleep 0' 7554 1726853162.29566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853162.29583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853162.29600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853162.29626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853162.29646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853162.29686: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853162.29760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853162.29796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853162.29806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853162.29905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853163.01497: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7554 1726853163.03611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853163.03638: stderr chunk (state=3): >>><<< 7554 1726853163.03643: stdout chunk (state=3): >>><<< 7554 1726853163.03658: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853163.03695: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'auto_gateway': True, 'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/64', '203.0.113.2/24'], 'gateway6': '2001:db8::1', 'gateway4': '203.0.113.1', 'route_metric4': 65535}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853163.03703: _low_level_execute_command(): starting 7554 1726853163.03708: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853161.94957-8181-251406519028454/ > /dev/null 2>&1 && sleep 0' 7554 
1726853163.04153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853163.04156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.04159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853163.04161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853163.04163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.04215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853163.04218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853163.04222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853163.04287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853163.06208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853163.06235: stderr chunk (state=3): >>><<< 7554 1726853163.06238: stdout chunk (state=3): >>><<< 7554 1726853163.06253: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853163.06259: handler run complete 7554 1726853163.06286: attempt loop complete, returning result 7554 1726853163.06289: _execute() done 7554 1726853163.06291: dumping result to json 7554 1726853163.06296: done dumping result, returning 7554 1726853163.06305: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-bdc3-98b6-000000000026] 7554 1726853163.06309: sending task result for task 02083763-bbaf-bdc3-98b6-000000000026 7554 1726853163.06412: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000026 7554 1726853163.06415: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": 
true, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1", "route_metric4": 65535 }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63 (not-active) 7554 1726853163.06542: no more pending results, returning what we have 7554 1726853163.06547: results queue empty 7554 1726853163.06548: checking for any_errors_fatal 7554 1726853163.06555: done checking for any_errors_fatal 7554 1726853163.06556: checking for max_fail_percentage 7554 1726853163.06557: done checking for max_fail_percentage 7554 1726853163.06558: checking to see if all hosts have failed and the running result is not ok 7554 1726853163.06559: done checking to see if all hosts have failed 7554 1726853163.06560: getting the remaining hosts for this loop 7554 1726853163.06561: done getting the remaining hosts for this loop 7554 1726853163.06564: getting the next task for host managed_node3 7554 1726853163.06569: done getting next task for host managed_node3 7554 1726853163.06574: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7554 1726853163.06577: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853163.06586: getting variables 7554 1726853163.06588: in VariableManager get_vars() 7554 1726853163.06633: Calling all_inventory to load vars for managed_node3 7554 1726853163.06636: Calling groups_inventory to load vars for managed_node3 7554 1726853163.06638: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853163.06649: Calling all_plugins_play to load vars for managed_node3 7554 1726853163.06652: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853163.06654: Calling groups_plugins_play to load vars for managed_node3 7554 1726853163.07587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853163.08453: done with get_vars() 7554 1726853163.08468: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:26:03 -0400 (0:00:01.300) 0:00:17.052 ****** 7554 1726853163.08531: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7554 1726853163.08532: Creating lock for fedora.linux_system_roles.network_state 7554 1726853163.08762: worker is 1 (out of 1 available) 7554 1726853163.08778: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7554 1726853163.08791: done queuing things up, now waiting for results queue to drain 7554 1726853163.08792: waiting for pending results... 
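The task result above is the JSON payload the `network_connections` module printed on the remote host's stdout, which the controller parses to build the `changed: [managed_node3]` summary. A minimal sketch of that parsing step follows; the sample payload is a trimmed, hypothetical reconstruction mirroring the shape logged above (`changed`, `invocation`, `module_args`), not the full result:

```python
import json

# Trimmed sample payload shaped like the module result in the log above.
# Real results also carry "warnings", "stderr", and the private "_invocation".
raw = '''
{"changed": true,
 "invocation": {"module_args": {"provider": "nm",
   "connections": [{"name": "veth0", "type": "ethernet", "state": "up"}]}}}
'''

result = json.loads(raw)

# Pull the connection names back out of the echoed module arguments,
# as one would when auditing which profiles a run touched.
names = [c["name"] for c in result["invocation"]["module_args"]["connections"]]

print(result["changed"], names)
```

Because the module echoes its own arguments back under `invocation.module_args`, the controller-side log can show exactly what configuration was applied without re-reading the playbook.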
7554 1726853163.08965: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7554 1726853163.09057: in run() - task 02083763-bbaf-bdc3-98b6-000000000027 7554 1726853163.09069: variable 'ansible_search_path' from source: unknown 7554 1726853163.09074: variable 'ansible_search_path' from source: unknown 7554 1726853163.09104: calling self._execute() 7554 1726853163.09175: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.09179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853163.09189: variable 'omit' from source: magic vars 7554 1726853163.09577: variable 'ansible_distribution_major_version' from source: facts 7554 1726853163.09580: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853163.09670: variable 'network_state' from source: role '' defaults 7554 1726853163.09690: Evaluated conditional (network_state != {}): False 7554 1726853163.09698: when evaluation is False, skipping this task 7554 1726853163.09704: _execute() done 7554 1726853163.09711: dumping result to json 7554 1726853163.09718: done dumping result, returning 7554 1726853163.09728: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-bdc3-98b6-000000000027] 7554 1726853163.09738: sending task result for task 02083763-bbaf-bdc3-98b6-000000000027 7554 1726853163.09977: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000027 7554 1726853163.09981: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853163.10028: no more pending results, returning what we have 7554 1726853163.10031: results queue empty 7554 1726853163.10032: checking for any_errors_fatal 7554 1726853163.10040: done checking for any_errors_fatal 7554 1726853163.10041: 
checking for max_fail_percentage 7554 1726853163.10042: done checking for max_fail_percentage 7554 1726853163.10043: checking to see if all hosts have failed and the running result is not ok 7554 1726853163.10044: done checking to see if all hosts have failed 7554 1726853163.10045: getting the remaining hosts for this loop 7554 1726853163.10046: done getting the remaining hosts for this loop 7554 1726853163.10049: getting the next task for host managed_node3 7554 1726853163.10055: done getting next task for host managed_node3 7554 1726853163.10058: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7554 1726853163.10062: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853163.10078: getting variables 7554 1726853163.10079: in VariableManager get_vars() 7554 1726853163.10181: Calling all_inventory to load vars for managed_node3 7554 1726853163.10183: Calling groups_inventory to load vars for managed_node3 7554 1726853163.10186: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853163.10309: Calling all_plugins_play to load vars for managed_node3 7554 1726853163.10312: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853163.10315: Calling groups_plugins_play to load vars for managed_node3 7554 1726853163.11081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853163.11931: done with get_vars() 7554 1726853163.11951: done getting variables 7554 1726853163.11998: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:26:03 -0400 (0:00:00.034) 0:00:17.087 ****** 7554 1726853163.12022: entering _queue_task() for managed_node3/debug 7554 1726853163.12266: worker is 1 (out of 1 available) 7554 1726853163.12281: exiting _queue_task() for managed_node3/debug 7554 1726853163.12294: done queuing things up, now waiting for results queue to drain 7554 1726853163.12295: waiting for pending results... 
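[annotation] The skip recorded above for "Configure networking state" follows directly from the logged conditional: `network_state` comes from the role defaults as an empty dict, so `network_state != {}` evaluates False and the task is skipped. A minimal sketch of that decision (simplified — Ansible evaluates the expression through Jinja2, not plain Python):

```python
# Sketch of the skip decision logged above (illustrative, not Ansible source).
network_state = {}  # "variable 'network_state' from source: role '' defaults"

# "Evaluated conditional (network_state != {}): False"
when_result = network_state != {}

if not when_result:
    # "when evaluation is False, skipping this task"
    result = {
        "changed": False,
        "false_condition": "network_state != {}",
        "skip_reason": "Conditional result was False",
    }

print(when_result)  # False -> task is skipped
```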
7554 1726853163.12611: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7554 1726853163.12819: in run() - task 02083763-bbaf-bdc3-98b6-000000000028 7554 1726853163.12842: variable 'ansible_search_path' from source: unknown 7554 1726853163.12854: variable 'ansible_search_path' from source: unknown 7554 1726853163.12912: calling self._execute() 7554 1726853163.13016: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.13030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853163.13065: variable 'omit' from source: magic vars 7554 1726853163.13477: variable 'ansible_distribution_major_version' from source: facts 7554 1726853163.13495: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853163.13576: variable 'omit' from source: magic vars 7554 1726853163.13579: variable 'omit' from source: magic vars 7554 1726853163.13626: variable 'omit' from source: magic vars 7554 1726853163.13680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853163.13728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853163.13763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853163.13788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853163.13806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853163.13861: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853163.13883: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.13893: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7554 1726853163.14033: Set connection var ansible_shell_executable to /bin/sh 7554 1726853163.14078: Set connection var ansible_pipelining to False 7554 1726853163.14275: Set connection var ansible_shell_type to sh 7554 1726853163.14279: Set connection var ansible_connection to ssh 7554 1726853163.14282: Set connection var ansible_timeout to 10 7554 1726853163.14284: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853163.14285: variable 'ansible_shell_executable' from source: unknown 7554 1726853163.14287: variable 'ansible_connection' from source: unknown 7554 1726853163.14290: variable 'ansible_module_compression' from source: unknown 7554 1726853163.14291: variable 'ansible_shell_type' from source: unknown 7554 1726853163.14293: variable 'ansible_shell_executable' from source: unknown 7554 1726853163.14295: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.14296: variable 'ansible_pipelining' from source: unknown 7554 1726853163.14298: variable 'ansible_timeout' from source: unknown 7554 1726853163.14300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853163.14331: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853163.14350: variable 'omit' from source: magic vars 7554 1726853163.14361: starting attempt loop 7554 1726853163.14368: running the handler 7554 1726853163.14503: variable '__network_connections_result' from source: set_fact 7554 1726853163.14565: handler run complete 7554 1726853163.14591: attempt loop complete, returning result 7554 1726853163.14599: _execute() done 7554 1726853163.14605: dumping result to json 7554 1726853163.14612: done dumping result, returning 7554 
1726853163.14626: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-bdc3-98b6-000000000028] 7554 1726853163.14635: sending task result for task 02083763-bbaf-bdc3-98b6-000000000028 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63 (not-active)" ] } 7554 1726853163.14817: no more pending results, returning what we have 7554 1726853163.14820: results queue empty 7554 1726853163.14821: checking for any_errors_fatal 7554 1726853163.14828: done checking for any_errors_fatal 7554 1726853163.14828: checking for max_fail_percentage 7554 1726853163.14830: done checking for max_fail_percentage 7554 1726853163.14830: checking to see if all hosts have failed and the running result is not ok 7554 1726853163.14831: done checking to see if all hosts have failed 7554 1726853163.14832: getting the remaining hosts for this loop 7554 1726853163.14833: done getting the remaining hosts for this loop 7554 1726853163.14837: getting the next task for host managed_node3 7554 1726853163.14842: done getting next task for host managed_node3 7554 1726853163.14846: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7554 1726853163.14851: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853163.14863: getting variables 7554 1726853163.14865: in VariableManager get_vars() 7554 1726853163.14913: Calling all_inventory to load vars for managed_node3 7554 1726853163.14915: Calling groups_inventory to load vars for managed_node3 7554 1726853163.14917: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853163.14926: Calling all_plugins_play to load vars for managed_node3 7554 1726853163.14929: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853163.14931: Calling groups_plugins_play to load vars for managed_node3 7554 1726853163.15490: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000028 7554 1726853163.15494: WORKER PROCESS EXITING 7554 1726853163.16034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853163.16880: done with get_vars() 7554 1726853163.16896: done getting variables 7554 1726853163.16940: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:26:03 -0400 (0:00:00.049) 0:00:17.137 ****** 7554 1726853163.16966: entering _queue_task() for managed_node3/debug 7554 1726853163.17266: worker is 1 (out of 1 available) 7554 1726853163.17280: exiting _queue_task() for managed_node3/debug 7554 1726853163.17293: done queuing things up, now waiting for results queue to drain 7554 1726853163.17294: waiting for pending results... 
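[annotation] The `__network_connections_result.stderr_lines` printed in the `ok:` result above are not stored separately by the module: Ansible derives the `*_lines` fields by splitting the corresponding `stdout`/`stderr` string on newlines. Using the exact lines from this log:

```python
# stderr as returned by the network_connections module (taken verbatim from the
# full result dumped later in this log).
stderr = (
    "[003] #0, state:up persistent_state:present, 'veth0': add connection "
    "veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63\n"
    "[004] #0, state:up persistent_state:present, 'veth0': up connection "
    "veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63 (not-active)\n"
)

# This is how stderr_lines in the task result is produced.
stderr_lines = stderr.splitlines()
print(len(stderr_lines))  # 2
```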
7554 1726853163.17690: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7554 1726853163.17739: in run() - task 02083763-bbaf-bdc3-98b6-000000000029 7554 1726853163.17763: variable 'ansible_search_path' from source: unknown 7554 1726853163.17770: variable 'ansible_search_path' from source: unknown 7554 1726853163.17815: calling self._execute() 7554 1726853163.17920: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.17933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853163.17952: variable 'omit' from source: magic vars 7554 1726853163.18325: variable 'ansible_distribution_major_version' from source: facts 7554 1726853163.18334: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853163.18343: variable 'omit' from source: magic vars 7554 1726853163.18384: variable 'omit' from source: magic vars 7554 1726853163.18409: variable 'omit' from source: magic vars 7554 1726853163.18449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853163.18480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853163.18495: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853163.18508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853163.18518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853163.18542: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853163.18547: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.18549: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853163.18620: Set connection var ansible_shell_executable to /bin/sh 7554 1726853163.18627: Set connection var ansible_pipelining to False 7554 1726853163.18630: Set connection var ansible_shell_type to sh 7554 1726853163.18632: Set connection var ansible_connection to ssh 7554 1726853163.18639: Set connection var ansible_timeout to 10 7554 1726853163.18644: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853163.18662: variable 'ansible_shell_executable' from source: unknown 7554 1726853163.18668: variable 'ansible_connection' from source: unknown 7554 1726853163.18672: variable 'ansible_module_compression' from source: unknown 7554 1726853163.18675: variable 'ansible_shell_type' from source: unknown 7554 1726853163.18677: variable 'ansible_shell_executable' from source: unknown 7554 1726853163.18684: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.18686: variable 'ansible_pipelining' from source: unknown 7554 1726853163.18688: variable 'ansible_timeout' from source: unknown 7554 1726853163.18690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853163.18794: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853163.18805: variable 'omit' from source: magic vars 7554 1726853163.18808: starting attempt loop 7554 1726853163.18811: running the handler 7554 1726853163.18850: variable '__network_connections_result' from source: set_fact 7554 1726853163.18906: variable '__network_connections_result' from source: set_fact 7554 1726853163.18997: handler run complete 7554 1726853163.19019: attempt loop complete, returning result 7554 1726853163.19024: _execute() done 7554 1726853163.19027: dumping 
result to json 7554 1726853163.19029: done dumping result, returning 7554 1726853163.19038: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-bdc3-98b6-000000000029] 7554 1726853163.19044: sending task result for task 02083763-bbaf-bdc3-98b6-000000000029 7554 1726853163.19132: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000029 7554 1726853163.19135: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": true, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1", "route_metric4": 65535 }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3c89e0d2-18d0-4f1d-897d-821d98d74a63 (not-active)" ] } } 7554 1726853163.19222: no more pending results, returning what we have 7554 1726853163.19225: results queue empty 7554 1726853163.19226: checking for any_errors_fatal 7554 1726853163.19232: done checking for any_errors_fatal 7554 1726853163.19232: checking for max_fail_percentage 7554 1726853163.19234: done checking for max_fail_percentage 7554 1726853163.19234: checking to see if all hosts have failed and the running 
result is not ok 7554 1726853163.19235: done checking to see if all hosts have failed 7554 1726853163.19236: getting the remaining hosts for this loop 7554 1726853163.19237: done getting the remaining hosts for this loop 7554 1726853163.19241: getting the next task for host managed_node3 7554 1726853163.19248: done getting next task for host managed_node3 7554 1726853163.19252: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7554 1726853163.19254: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853163.19264: getting variables 7554 1726853163.19273: in VariableManager get_vars() 7554 1726853163.19311: Calling all_inventory to load vars for managed_node3 7554 1726853163.19313: Calling groups_inventory to load vars for managed_node3 7554 1726853163.19315: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853163.19323: Calling all_plugins_play to load vars for managed_node3 7554 1726853163.19325: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853163.19327: Calling groups_plugins_play to load vars for managed_node3 7554 1726853163.20099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853163.21495: done with get_vars() 7554 1726853163.21533: done getting variables 7554 1726853163.21638: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:26:03 -0400 (0:00:00.047) 0:00:17.184 ****** 7554 1726853163.21677: entering _queue_task() for managed_node3/debug 7554 1726853163.21964: worker is 1 (out of 1 available) 7554 1726853163.21980: exiting _queue_task() for managed_node3/debug 7554 1726853163.21991: done queuing things up, now waiting for results queue to drain 7554 1726853163.21992: waiting for pending results... 
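[annotation] The `module_args` dumped in the "Show debug messages for the network_connections" result above correspond to role input of roughly this shape. This is reconstructed from the logged invocation, not the original playbook; the role supplies the remaining defaults (`provider: nm`, `ignore_errors`, `force_state_change`):

```yaml
# Reconstructed from the module_args in the log above; illustrative only.
network_connections:
  - name: veth0
    type: ethernet
    state: up
    ip:
      dhcp4: false
      auto6: false
      auto_gateway: true
      address:
        - 2001:db8::2/64
        - 203.0.113.2/24
      gateway4: 203.0.113.1
      gateway6: 2001:db8::1
      route_metric4: 65535
```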
7554 1726853163.22168: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7554 1726853163.22257: in run() - task 02083763-bbaf-bdc3-98b6-00000000002a 7554 1726853163.22270: variable 'ansible_search_path' from source: unknown 7554 1726853163.22275: variable 'ansible_search_path' from source: unknown 7554 1726853163.22303: calling self._execute() 7554 1726853163.22378: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.22382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853163.22390: variable 'omit' from source: magic vars 7554 1726853163.22664: variable 'ansible_distribution_major_version' from source: facts 7554 1726853163.22676: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853163.22767: variable 'network_state' from source: role '' defaults 7554 1726853163.22772: Evaluated conditional (network_state != {}): False 7554 1726853163.22775: when evaluation is False, skipping this task 7554 1726853163.22778: _execute() done 7554 1726853163.22780: dumping result to json 7554 1726853163.22782: done dumping result, returning 7554 1726853163.22790: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-bdc3-98b6-00000000002a] 7554 1726853163.22795: sending task result for task 02083763-bbaf-bdc3-98b6-00000000002a 7554 1726853163.22883: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000002a 7554 1726853163.22885: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7554 1726853163.22929: no more pending results, returning what we have 7554 1726853163.22933: results queue empty 7554 1726853163.22933: checking for any_errors_fatal 7554 1726853163.22943: done checking for any_errors_fatal 7554 1726853163.22946: checking for max_fail_percentage 7554 
1726853163.22948: done checking for max_fail_percentage 7554 1726853163.22948: checking to see if all hosts have failed and the running result is not ok 7554 1726853163.22949: done checking to see if all hosts have failed 7554 1726853163.22950: getting the remaining hosts for this loop 7554 1726853163.22951: done getting the remaining hosts for this loop 7554 1726853163.22955: getting the next task for host managed_node3 7554 1726853163.22961: done getting next task for host managed_node3 7554 1726853163.22965: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7554 1726853163.22968: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853163.22988: getting variables 7554 1726853163.22990: in VariableManager get_vars() 7554 1726853163.23034: Calling all_inventory to load vars for managed_node3 7554 1726853163.23037: Calling groups_inventory to load vars for managed_node3 7554 1726853163.23039: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853163.23049: Calling all_plugins_play to load vars for managed_node3 7554 1726853163.23051: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853163.23054: Calling groups_plugins_play to load vars for managed_node3 7554 1726853163.24210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853163.25576: done with get_vars() 7554 1726853163.25595: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:26:03 -0400 (0:00:00.039) 0:00:17.224 ****** 7554 1726853163.25666: entering _queue_task() for managed_node3/ping 7554 1726853163.25668: Creating lock for ping 7554 1726853163.25927: worker is 1 (out of 1 available) 7554 1726853163.25947: exiting _queue_task() for managed_node3/ping 7554 1726853163.25958: done queuing things up, now waiting for results queue to drain 7554 1726853163.25960: waiting for pending results... 
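[annotation] The "Re-test connectivity" task queued above runs Ansible's `ping` module (note "Creating lock for ping" and the later `AnsiballZ_ping.py` transfer). Despite the name it sends no ICMP: it only verifies that a module can be shipped to the target, executed under Python, and round-trip a value. A rough sketch of its contract (simplified, not the real module source):

```python
# Simplified contract of ansible.builtin.ping: echo `data` back, or raise when
# asked to, so failure handling can be exercised.
def ping(data="pong"):
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "changed": False}

print(ping())  # {'ping': 'pong', 'changed': False}
```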
7554 1726853163.26132: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7554 1726853163.26226: in run() - task 02083763-bbaf-bdc3-98b6-00000000002b 7554 1726853163.26238: variable 'ansible_search_path' from source: unknown 7554 1726853163.26242: variable 'ansible_search_path' from source: unknown 7554 1726853163.26270: calling self._execute() 7554 1726853163.26348: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.26351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853163.26359: variable 'omit' from source: magic vars 7554 1726853163.26629: variable 'ansible_distribution_major_version' from source: facts 7554 1726853163.26638: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853163.26647: variable 'omit' from source: magic vars 7554 1726853163.26686: variable 'omit' from source: magic vars 7554 1726853163.26710: variable 'omit' from source: magic vars 7554 1726853163.26747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853163.26775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853163.26790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853163.26803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853163.26813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853163.26841: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853163.26847: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.26850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 
1726853163.26913: Set connection var ansible_shell_executable to /bin/sh 7554 1726853163.26920: Set connection var ansible_pipelining to False 7554 1726853163.26923: Set connection var ansible_shell_type to sh 7554 1726853163.26925: Set connection var ansible_connection to ssh 7554 1726853163.26933: Set connection var ansible_timeout to 10 7554 1726853163.26937: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853163.26956: variable 'ansible_shell_executable' from source: unknown 7554 1726853163.26961: variable 'ansible_connection' from source: unknown 7554 1726853163.26963: variable 'ansible_module_compression' from source: unknown 7554 1726853163.26966: variable 'ansible_shell_type' from source: unknown 7554 1726853163.26968: variable 'ansible_shell_executable' from source: unknown 7554 1726853163.26970: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.26974: variable 'ansible_pipelining' from source: unknown 7554 1726853163.26976: variable 'ansible_timeout' from source: unknown 7554 1726853163.26981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853163.27124: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853163.27134: variable 'omit' from source: magic vars 7554 1726853163.27138: starting attempt loop 7554 1726853163.27140: running the handler 7554 1726853163.27154: _low_level_execute_command(): starting 7554 1726853163.27162: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853163.27796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.27877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853163.27893: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.27922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853163.27938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853163.27970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853163.28067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853163.29780: stdout chunk (state=3): >>>/root <<< 7554 1726853163.29881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853163.29907: stderr chunk (state=3): >>><<< 7554 1726853163.29910: stdout chunk (state=3): >>><<< 7554 1726853163.29929: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853163.29947: _low_level_execute_command(): starting 7554 1726853163.29951: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204 `" && echo ansible-tmp-1726853163.2992985-8246-229290854424204="` echo /root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204 `" ) && sleep 0' 7554 1726853163.30388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853163.30392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.30395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 7554 1726853163.30405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853163.30408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.30450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853163.30453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853163.30459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853163.30519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853163.32511: stdout chunk (state=3): >>>ansible-tmp-1726853163.2992985-8246-229290854424204=/root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204 <<< 7554 1726853163.32616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853163.32643: stderr chunk (state=3): >>><<< 7554 1726853163.32646: stdout chunk (state=3): >>><<< 7554 1726853163.32662: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853163.2992985-8246-229290854424204=/root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853163.32710: variable 'ansible_module_compression' from source: unknown 7554 1726853163.32744: ANSIBALLZ: Using lock for ping 7554 1726853163.32747: ANSIBALLZ: Acquiring lock 7554 1726853163.32752: ANSIBALLZ: Lock acquired: 140257820802320 7554 1726853163.32754: ANSIBALLZ: Creating module 7554 1726853163.40231: ANSIBALLZ: Writing module into payload 7554 1726853163.40274: ANSIBALLZ: Writing module 7554 1726853163.40292: ANSIBALLZ: Renaming module 7554 1726853163.40304: ANSIBALLZ: Done creating module 7554 1726853163.40314: variable 'ansible_facts' from source: unknown 7554 1726853163.40360: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204/AnsiballZ_ping.py 7554 1726853163.40463: Sending initial data 7554 1726853163.40466: Sent initial data (151 bytes) 7554 1726853163.40934: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853163.40937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.40940: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853163.40942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853163.40944: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.41000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853163.41003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853163.41005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853163.41078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853163.42744: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853163.42747: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853163.42802: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853163.42862: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp9xguj8cm /root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204/AnsiballZ_ping.py <<< 7554 1726853163.42868: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204/AnsiballZ_ping.py" <<< 7554 1726853163.42924: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp9xguj8cm" to remote "/root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204/AnsiballZ_ping.py" <<< 7554 1726853163.42927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204/AnsiballZ_ping.py" <<< 7554 1726853163.43500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853163.43543: stderr chunk (state=3): >>><<< 7554 1726853163.43546: stdout chunk (state=3): >>><<< 7554 1726853163.43575: done transferring module to remote 7554 1726853163.43584: _low_level_execute_command(): starting 7554 1726853163.43589: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204/ /root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204/AnsiballZ_ping.py && sleep 0' 7554 1726853163.44036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853163.44039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853163.44042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.44046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853163.44053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853163.44055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.44112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853163.44115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853163.44163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853163.46054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853163.46057: stdout chunk (state=3): >>><<< 7554 1726853163.46059: stderr chunk (state=3): >>><<< 7554 1726853163.46153: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853163.46156: _low_level_execute_command(): starting 7554 1726853163.46159: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204/AnsiballZ_ping.py && sleep 0' 7554 1726853163.46623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853163.46643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.46689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853163.46701: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853163.46775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853163.62459: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7554 1726853163.63913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853163.63917: stdout chunk (state=3): >>><<< 7554 1726853163.63920: stderr chunk (state=3): >>><<< 7554 1726853163.63940: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853163.64049: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853163.64052: _low_level_execute_command(): starting 7554 1726853163.64054: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204/ > /dev/null 2>&1 && sleep 0' 7554 1726853163.64660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853163.64679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853163.64696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853163.64714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853163.64740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853163.64795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853163.64875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853163.64902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853163.64999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853163.66926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853163.66948: stderr chunk (state=3): >>><<< 7554 1726853163.66964: stdout chunk (state=3): >>><<< 7554 1726853163.67077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 7554 1726853163.67080: handler run complete 7554 1726853163.67083: attempt loop complete, returning result 7554 1726853163.67085: _execute() done 7554 1726853163.67087: dumping result to json 7554 1726853163.67089: done dumping result, returning 7554 1726853163.67091: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-bdc3-98b6-00000000002b] 7554 1726853163.67094: sending task result for task 02083763-bbaf-bdc3-98b6-00000000002b 7554 1726853163.67163: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000002b 7554 1726853163.67166: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7554 1726853163.67232: no more pending results, returning what we have 7554 1726853163.67236: results queue empty 7554 1726853163.67236: checking for any_errors_fatal 7554 1726853163.67246: done checking for any_errors_fatal 7554 1726853163.67247: checking for max_fail_percentage 7554 1726853163.67249: done checking for max_fail_percentage 7554 1726853163.67250: checking to see if all hosts have failed and the running result is not ok 7554 1726853163.67251: done checking to see if all hosts have failed 7554 1726853163.67252: getting the remaining hosts for this loop 7554 1726853163.67253: done getting the remaining hosts for this loop 7554 1726853163.67257: getting the next task for host managed_node3 7554 1726853163.67268: done getting next task for host managed_node3 7554 1726853163.67377: ^ task is: TASK: meta (role_complete) 7554 1726853163.67382: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853163.67394: getting variables 7554 1726853163.67397: in VariableManager get_vars() 7554 1726853163.67453: Calling all_inventory to load vars for managed_node3 7554 1726853163.67456: Calling groups_inventory to load vars for managed_node3 7554 1726853163.67459: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853163.67469: Calling all_plugins_play to load vars for managed_node3 7554 1726853163.67585: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853163.67594: Calling groups_plugins_play to load vars for managed_node3 7554 1726853163.69342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853163.71006: done with get_vars() 7554 1726853163.71030: done getting variables 7554 1726853163.71114: done queuing things up, now waiting for results queue to drain 7554 1726853163.71117: results queue empty 7554 1726853163.71117: checking for any_errors_fatal 7554 1726853163.71120: done checking for any_errors_fatal 7554 1726853163.71121: checking for max_fail_percentage 7554 1726853163.71122: done checking for max_fail_percentage 7554 1726853163.71123: checking to see if all hosts have failed and the running result is not ok 7554 1726853163.71123: done checking to see if all hosts have failed 7554 1726853163.71124: getting the remaining hosts for this loop 7554 1726853163.71125: done getting the remaining hosts for this loop 7554 1726853163.71128: getting the next task for host managed_node3 7554 1726853163.71131: done getting next task for host managed_node3 7554 1726853163.71134: ^ task is: TASK: Include the task 'assert_device_present.yml' 7554 1726853163.71135: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853163.71138: getting variables 7554 1726853163.71139: in VariableManager get_vars() 7554 1726853163.71156: Calling all_inventory to load vars for managed_node3 7554 1726853163.71158: Calling groups_inventory to load vars for managed_node3 7554 1726853163.71160: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853163.71165: Calling all_plugins_play to load vars for managed_node3 7554 1726853163.71168: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853163.71170: Calling groups_plugins_play to load vars for managed_node3 7554 1726853163.73743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853163.76714: done with get_vars() 7554 1726853163.76743: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:42 Friday 20 September 2024 13:26:03 -0400 (0:00:00.513) 0:00:17.737 ****** 7554 1726853163.77029: entering _queue_task() for managed_node3/include_tasks 7554 1726853163.77463: worker is 1 (out of 1 available) 7554 1726853163.77979: exiting _queue_task() for managed_node3/include_tasks 7554 1726853163.77990: done queuing things up, now waiting for results queue to drain 7554 1726853163.77992: waiting for pending results... 
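The trace above captures a complete `ping` module round-trip: create a remote tmp directory, SFTP the AnsiballZ payload, `chmod u+x`, execute it with the remote Python, parse the JSON on stdout (`{"ping": "pong", ...}`), then remove the tmp directory. The contract the module satisfies is visible in that JSON: it echoes its `data` argument back as the value of `"ping"`. A minimal sketch of that contract (an illustration inferred from the log output, not Ansible's actual module source):

```python
import json


def ping_module(module_args: dict) -> dict:
    # Echo the `data` argument back as "ping", defaulting to "pong" --
    # matching the {"ping": "pong", "invocation": ...} payload in the trace.
    data = module_args.get("data", "pong")
    if data == "crash":
        # The real ping module deliberately raises when data == "crash",
        # to let tests exercise module failure handling.
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}


# Mirrors the task invocation seen above ('module_args': {'data': 'pong'}):
print(json.dumps(ping_module({"data": "pong"})))
```

Modules communicate with the controller only through this JSON-on-stdout convention, which is why `_low_level_execute_command()` reports the result as a `stdout=` string that the executor then parses.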
7554 1726853163.78390: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7554 1726853163.78395: in run() - task 02083763-bbaf-bdc3-98b6-00000000005b 7554 1726853163.78399: variable 'ansible_search_path' from source: unknown 7554 1726853163.78436: calling self._execute() 7554 1726853163.78678: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853163.78681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853163.78812: variable 'omit' from source: magic vars 7554 1726853163.79477: variable 'ansible_distribution_major_version' from source: facts 7554 1726853163.79499: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853163.79514: _execute() done 7554 1726853163.79523: dumping result to json 7554 1726853163.79530: done dumping result, returning 7554 1726853163.79542: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [02083763-bbaf-bdc3-98b6-00000000005b] 7554 1726853163.79553: sending task result for task 02083763-bbaf-bdc3-98b6-00000000005b 7554 1726853163.79724: no more pending results, returning what we have 7554 1726853163.79729: in VariableManager get_vars() 7554 1726853163.79793: Calling all_inventory to load vars for managed_node3 7554 1726853163.79795: Calling groups_inventory to load vars for managed_node3 7554 1726853163.79798: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853163.79812: Calling all_plugins_play to load vars for managed_node3 7554 1726853163.79816: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853163.79819: Calling groups_plugins_play to load vars for managed_node3 7554 1726853163.80516: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000005b 7554 1726853163.80520: WORKER PROCESS EXITING 7554 1726853163.81569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7554 1726853163.83017: done with get_vars() 7554 1726853163.83042: variable 'ansible_search_path' from source: unknown 7554 1726853163.83058: we have included files to process 7554 1726853163.83060: generating all_blocks data 7554 1726853163.83062: done generating all_blocks data 7554 1726853163.83067: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7554 1726853163.83068: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7554 1726853163.83072: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7554 1726853163.83184: in VariableManager get_vars() 7554 1726853163.83214: done with get_vars() 7554 1726853163.83327: done processing included file 7554 1726853163.83329: iterating over new_blocks loaded from include file 7554 1726853163.83331: in VariableManager get_vars() 7554 1726853163.83352: done with get_vars() 7554 1726853163.83354: filtering new block on tags 7554 1726853163.83373: done filtering new block on tags 7554 1726853163.83376: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7554 1726853163.83381: extending task lists for all hosts with included blocks 7554 1726853163.90993: done extending task lists 7554 1726853163.90996: done processing included files 7554 1726853163.90997: results queue empty 7554 1726853163.90997: checking for any_errors_fatal 7554 1726853163.90999: done checking for any_errors_fatal 7554 1726853163.91000: checking for max_fail_percentage 7554 1726853163.91001: done checking for max_fail_percentage 7554 1726853163.91002: checking to see if all hosts have failed and the running 
result is not ok 7554 1726853163.91003: done checking to see if all hosts have failed 7554 1726853163.91004: getting the remaining hosts for this loop 7554 1726853163.91005: done getting the remaining hosts for this loop 7554 1726853163.91008: getting the next task for host managed_node3 7554 1726853163.91011: done getting next task for host managed_node3 7554 1726853163.91014: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7554 1726853163.91016: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853163.91019: getting variables 7554 1726853163.91020: in VariableManager get_vars() 7554 1726853163.91043: Calling all_inventory to load vars for managed_node3 7554 1726853163.91046: Calling groups_inventory to load vars for managed_node3 7554 1726853163.91048: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853163.91055: Calling all_plugins_play to load vars for managed_node3 7554 1726853163.91058: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853163.91098: Calling groups_plugins_play to load vars for managed_node3 7554 1726853163.92797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853164.04500: done with get_vars() 7554 1726853164.04526: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:26:04 -0400 (0:00:00.276) 0:00:18.014 ****** 7554 1726853164.04714: entering _queue_task() for managed_node3/include_tasks 7554 1726853164.05489: worker is 1 (out of 1 available) 7554 1726853164.05501: exiting _queue_task() for managed_node3/include_tasks 7554 1726853164.05517: done queuing things up, now waiting for results queue to drain 7554 1726853164.05520: waiting for pending results... 
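Throughout the trace, every remote operation happens under a path like `/root/.ansible/tmp/ansible-tmp-1726853163.2992985-8246-229290854424204`. Reading the fields off the log, the name appears to combine a timestamp, the controller's PID (`8246` here, a worker fork of `7554`), and a random suffix, which keeps concurrent task executions from colliding in the shared `~/.ansible/tmp` directory. A sketch of that naming scheme (field meanings inferred from the log, not taken from Ansible source):

```python
import os
import random
import time


def make_remote_tmp_name() -> str:
    # ansible-tmp-<epoch seconds>-<pid>-<random>: the three dash-separated
    # fields visible in the trace's tmp paths. The exact width of the random
    # component is an assumption for illustration.
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))


print(make_remote_tmp_name())
```

The directory is created with `umask 77 && mkdir -p ...` (so it is only accessible to the remote user), and the cleanup step at the end of each module run (`rm -f -r .../ > /dev/null 2>&1`) removes it unless `_ansible_keep_remote_files` is set.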
7554 1726853164.05947: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7554 1726853164.06160: in run() - task 02083763-bbaf-bdc3-98b6-0000000008c2 7554 1726853164.06230: variable 'ansible_search_path' from source: unknown 7554 1726853164.06239: variable 'ansible_search_path' from source: unknown 7554 1726853164.06475: calling self._execute() 7554 1726853164.06586: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.06600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.06617: variable 'omit' from source: magic vars 7554 1726853164.07413: variable 'ansible_distribution_major_version' from source: facts 7554 1726853164.07480: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853164.07530: _execute() done 7554 1726853164.07538: dumping result to json 7554 1726853164.07545: done dumping result, returning 7554 1726853164.07556: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-bdc3-98b6-0000000008c2] 7554 1726853164.07739: sending task result for task 02083763-bbaf-bdc3-98b6-0000000008c2 7554 1726853164.07839: no more pending results, returning what we have 7554 1726853164.07847: in VariableManager get_vars() 7554 1726853164.07903: Calling all_inventory to load vars for managed_node3 7554 1726853164.07911: Calling groups_inventory to load vars for managed_node3 7554 1726853164.07913: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853164.07920: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000008c2 7554 1726853164.07957: WORKER PROCESS EXITING 7554 1726853164.07970: Calling all_plugins_play to load vars for managed_node3 7554 1726853164.07975: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853164.07978: Calling groups_plugins_play to load vars for managed_node3 7554 1726853164.10619: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853164.14047: done with get_vars() 7554 1726853164.14069: variable 'ansible_search_path' from source: unknown 7554 1726853164.14070: variable 'ansible_search_path' from source: unknown 7554 1726853164.14108: we have included files to process 7554 1726853164.14109: generating all_blocks data 7554 1726853164.14175: done generating all_blocks data 7554 1726853164.14177: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7554 1726853164.14179: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7554 1726853164.14182: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7554 1726853164.14495: done processing included file 7554 1726853164.14497: iterating over new_blocks loaded from include file 7554 1726853164.14499: in VariableManager get_vars() 7554 1726853164.14524: done with get_vars() 7554 1726853164.14526: filtering new block on tags 7554 1726853164.14541: done filtering new block on tags 7554 1726853164.14660: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7554 1726853164.14667: extending task lists for all hosts with included blocks 7554 1726853164.14888: done extending task lists 7554 1726853164.14889: done processing included files 7554 1726853164.14890: results queue empty 7554 1726853164.14891: checking for any_errors_fatal 7554 1726853164.14894: done checking for any_errors_fatal 7554 1726853164.14894: checking for max_fail_percentage 7554 1726853164.14896: done checking for max_fail_percentage 7554 1726853164.14896: 
checking to see if all hosts have failed and the running result is not ok 7554 1726853164.14898: done checking to see if all hosts have failed 7554 1726853164.14898: getting the remaining hosts for this loop 7554 1726853164.14899: done getting the remaining hosts for this loop 7554 1726853164.14902: getting the next task for host managed_node3 7554 1726853164.14906: done getting next task for host managed_node3 7554 1726853164.14908: ^ task is: TASK: Get stat for interface {{ interface }} 7554 1726853164.14911: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853164.14913: getting variables 7554 1726853164.14914: in VariableManager get_vars() 7554 1726853164.14932: Calling all_inventory to load vars for managed_node3 7554 1726853164.14935: Calling groups_inventory to load vars for managed_node3 7554 1726853164.14937: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853164.14946: Calling all_plugins_play to load vars for managed_node3 7554 1726853164.14949: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853164.14953: Calling groups_plugins_play to load vars for managed_node3 7554 1726853164.17527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853164.20809: done with get_vars() 7554 1726853164.20952: done getting variables 7554 1726853164.21293: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:26:04 -0400 (0:00:00.166) 0:00:18.180 ****** 7554 1726853164.21322: entering _queue_task() for managed_node3/stat 7554 1726853164.22130: worker is 1 (out of 1 available) 7554 1726853164.22142: exiting _queue_task() for managed_node3/stat 7554 1726853164.22155: done queuing things up, now waiting for results queue to drain 7554 1726853164.22157: waiting for pending results... 
7554 1726853164.22554: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7554 1726853164.22913: in run() - task 02083763-bbaf-bdc3-98b6-000000000ac6 7554 1726853164.22918: variable 'ansible_search_path' from source: unknown 7554 1726853164.22920: variable 'ansible_search_path' from source: unknown 7554 1726853164.23087: calling self._execute() 7554 1726853164.23174: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.23277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.23281: variable 'omit' from source: magic vars 7554 1726853164.24024: variable 'ansible_distribution_major_version' from source: facts 7554 1726853164.24037: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853164.24043: variable 'omit' from source: magic vars 7554 1726853164.24215: variable 'omit' from source: magic vars 7554 1726853164.24415: variable 'interface' from source: play vars 7554 1726853164.24434: variable 'omit' from source: magic vars 7554 1726853164.24475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853164.24513: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853164.24639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853164.24657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853164.24669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853164.24774: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853164.24778: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.24781: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7554 1726853164.24993: Set connection var ansible_shell_executable to /bin/sh 7554 1726853164.25066: Set connection var ansible_pipelining to False 7554 1726853164.25069: Set connection var ansible_shell_type to sh 7554 1726853164.25074: Set connection var ansible_connection to ssh 7554 1726853164.25077: Set connection var ansible_timeout to 10 7554 1726853164.25080: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853164.25082: variable 'ansible_shell_executable' from source: unknown 7554 1726853164.25085: variable 'ansible_connection' from source: unknown 7554 1726853164.25087: variable 'ansible_module_compression' from source: unknown 7554 1726853164.25089: variable 'ansible_shell_type' from source: unknown 7554 1726853164.25092: variable 'ansible_shell_executable' from source: unknown 7554 1726853164.25094: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.25096: variable 'ansible_pipelining' from source: unknown 7554 1726853164.25098: variable 'ansible_timeout' from source: unknown 7554 1726853164.25176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.25723: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853164.25734: variable 'omit' from source: magic vars 7554 1726853164.25741: starting attempt loop 7554 1726853164.25746: running the handler 7554 1726853164.25760: _low_level_execute_command(): starting 7554 1726853164.25767: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853164.27247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.27486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853164.29180: stdout chunk (state=3): >>>/root <<< 7554 1726853164.29378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853164.29385: stderr chunk (state=3): >>><<< 7554 1726853164.29388: stdout chunk (state=3): >>><<< 7554 1726853164.29534: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853164.29537: _low_level_execute_command(): starting 7554 1726853164.29541: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803 `" && echo ansible-tmp-1726853164.2941785-8283-63152290580803="` echo /root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803 `" ) && sleep 0' 7554 1726853164.30579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853164.30584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853164.30794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.30798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853164.30816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.30876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853164.30893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853164.31039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853164.33132: stdout chunk (state=3): >>>ansible-tmp-1726853164.2941785-8283-63152290580803=/root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803 <<< 7554 1726853164.33136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853164.33139: stderr chunk (state=3): >>><<< 7554 1726853164.33141: stdout chunk (state=3): >>><<< 7554 1726853164.33160: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853164.2941785-8283-63152290580803=/root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853164.33212: variable 'ansible_module_compression' from source: unknown 7554 1726853164.33270: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7554 1726853164.33458: variable 'ansible_facts' from source: unknown 7554 1726853164.33621: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803/AnsiballZ_stat.py 7554 1726853164.34079: Sending initial data 7554 1726853164.34082: Sent initial data (150 bytes) 7554 1726853164.35397: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853164.35538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853164.35621: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853164.35778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853164.37415: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853164.37473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853164.37526: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmptu60sfib /root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803/AnsiballZ_stat.py <<< 7554 1726853164.37534: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803/AnsiballZ_stat.py" <<< 7554 1726853164.37584: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmptu60sfib" to remote "/root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803/AnsiballZ_stat.py" <<< 7554 1726853164.37590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803/AnsiballZ_stat.py" <<< 7554 1726853164.38216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853164.38234: stderr chunk (state=3): >>><<< 7554 1726853164.38241: stdout chunk (state=3): >>><<< 7554 1726853164.38265: done transferring module to remote 7554 1726853164.38275: _low_level_execute_command(): starting 7554 1726853164.38280: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803/ /root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803/AnsiballZ_stat.py && sleep 0' 7554 1726853164.38952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853164.38965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853164.38986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853164.39075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853164.40930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853164.40961: stderr chunk (state=3): >>><<< 7554 1726853164.40963: stdout chunk (state=3): >>><<< 7554 1726853164.41022: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853164.41025: _low_level_execute_command(): starting 7554 1726853164.41028: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803/AnsiballZ_stat.py && sleep 0' 7554 1726853164.41396: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853164.41400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853164.41403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.41451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853164.41456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853164.41521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853164.57433: stdout chunk 
(state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25123, "dev": 23, "nlink": 1, "atime": 1726853155.267833, "mtime": 1726853155.267833, "ctime": 1726853155.267833, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7554 1726853164.58832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853164.58859: stderr chunk (state=3): >>><<< 7554 1726853164.58862: stdout chunk (state=3): >>><<< 7554 1726853164.58882: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25123, "dev": 23, "nlink": 1, "atime": 1726853155.267833, "mtime": 1726853155.267833, "ctime": 1726853155.267833, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853164.58917: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853164.58925: _low_level_execute_command(): starting 7554 1726853164.58930: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853164.2941785-8283-63152290580803/ > /dev/null 2>&1 && sleep 0' 7554 1726853164.59337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853164.59351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853164.59354: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 
1726853164.59375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.59435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853164.59442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853164.59443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853164.59491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853164.61376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853164.61395: stderr chunk (state=3): >>><<< 7554 1726853164.61399: stdout chunk (state=3): >>><<< 7554 1726853164.61411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853164.61416: handler run complete 7554 1726853164.61448: attempt loop complete, returning result 7554 1726853164.61451: _execute() done 7554 1726853164.61453: dumping result to json 7554 1726853164.61455: done dumping result, returning 7554 1726853164.61466: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [02083763-bbaf-bdc3-98b6-000000000ac6] 7554 1726853164.61469: sending task result for task 02083763-bbaf-bdc3-98b6-000000000ac6 7554 1726853164.61575: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000ac6 7554 1726853164.61578: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853155.267833, "block_size": 4096, "blocks": 0, "ctime": 1726853155.267833, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25123, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726853155.267833, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7554 1726853164.61665: no more pending results, returning what we have 7554 1726853164.61669: results queue empty 
7554 1726853164.61669: checking for any_errors_fatal 7554 1726853164.61673: done checking for any_errors_fatal 7554 1726853164.61674: checking for max_fail_percentage 7554 1726853164.61675: done checking for max_fail_percentage 7554 1726853164.61676: checking to see if all hosts have failed and the running result is not ok 7554 1726853164.61677: done checking to see if all hosts have failed 7554 1726853164.61678: getting the remaining hosts for this loop 7554 1726853164.61679: done getting the remaining hosts for this loop 7554 1726853164.61683: getting the next task for host managed_node3 7554 1726853164.61691: done getting next task for host managed_node3 7554 1726853164.61693: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7554 1726853164.61696: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853164.61701: getting variables 7554 1726853164.61702: in VariableManager get_vars() 7554 1726853164.61747: Calling all_inventory to load vars for managed_node3 7554 1726853164.61750: Calling groups_inventory to load vars for managed_node3 7554 1726853164.61752: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853164.61761: Calling all_plugins_play to load vars for managed_node3 7554 1726853164.61763: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853164.61766: Calling groups_plugins_play to load vars for managed_node3 7554 1726853164.62554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853164.63423: done with get_vars() 7554 1726853164.63438: done getting variables 7554 1726853164.63484: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853164.63574: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:26:04 -0400 (0:00:00.422) 0:00:18.603 ****** 7554 1726853164.63595: entering _queue_task() for managed_node3/assert 7554 1726853164.63828: worker is 1 (out of 1 available) 7554 1726853164.63841: exiting _queue_task() for managed_node3/assert 7554 1726853164.63855: done queuing things up, now waiting for results queue to drain 7554 1726853164.63857: waiting for pending results... 
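The assert task queued here (at `assert_device_present.yml:5`) is not printed in full, but its condition is visible in the log ("Evaluated conditional (interface_stat.stat.exists): True"). A minimal sketch consistent with that, assuming the logged task name verbatim:

```yaml
# Hypothetical sketch of the assert at assert_device_present.yml:5.
# The condition matches the logged line
# "Evaluated conditional (interface_stat.stat.exists): True".
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
```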
7554 1726853164.64034: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7554 1726853164.64105: in run() - task 02083763-bbaf-bdc3-98b6-0000000008c3 7554 1726853164.64115: variable 'ansible_search_path' from source: unknown 7554 1726853164.64119: variable 'ansible_search_path' from source: unknown 7554 1726853164.64150: calling self._execute() 7554 1726853164.64228: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.64234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.64241: variable 'omit' from source: magic vars 7554 1726853164.64507: variable 'ansible_distribution_major_version' from source: facts 7554 1726853164.64521: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853164.64524: variable 'omit' from source: magic vars 7554 1726853164.64553: variable 'omit' from source: magic vars 7554 1726853164.64619: variable 'interface' from source: play vars 7554 1726853164.64636: variable 'omit' from source: magic vars 7554 1726853164.64668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853164.64698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853164.64715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853164.64728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853164.64739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853164.64765: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853164.64769: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.64773: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.64837: Set connection var ansible_shell_executable to /bin/sh 7554 1726853164.64847: Set connection var ansible_pipelining to False 7554 1726853164.64850: Set connection var ansible_shell_type to sh 7554 1726853164.64852: Set connection var ansible_connection to ssh 7554 1726853164.64862: Set connection var ansible_timeout to 10 7554 1726853164.64865: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853164.64884: variable 'ansible_shell_executable' from source: unknown 7554 1726853164.64887: variable 'ansible_connection' from source: unknown 7554 1726853164.64890: variable 'ansible_module_compression' from source: unknown 7554 1726853164.64892: variable 'ansible_shell_type' from source: unknown 7554 1726853164.64894: variable 'ansible_shell_executable' from source: unknown 7554 1726853164.64896: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.64898: variable 'ansible_pipelining' from source: unknown 7554 1726853164.64902: variable 'ansible_timeout' from source: unknown 7554 1726853164.64907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.65006: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853164.65016: variable 'omit' from source: magic vars 7554 1726853164.65020: starting attempt loop 7554 1726853164.65024: running the handler 7554 1726853164.65116: variable 'interface_stat' from source: set_fact 7554 1726853164.65130: Evaluated conditional (interface_stat.stat.exists): True 7554 1726853164.65136: handler run complete 7554 1726853164.65148: attempt loop complete, returning result 7554 1726853164.65151: _execute() done 
7554 1726853164.65154: dumping result to json 7554 1726853164.65156: done dumping result, returning 7554 1726853164.65162: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [02083763-bbaf-bdc3-98b6-0000000008c3] 7554 1726853164.65167: sending task result for task 02083763-bbaf-bdc3-98b6-0000000008c3 7554 1726853164.65249: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000008c3 7554 1726853164.65252: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853164.65330: no more pending results, returning what we have 7554 1726853164.65333: results queue empty 7554 1726853164.65334: checking for any_errors_fatal 7554 1726853164.65340: done checking for any_errors_fatal 7554 1726853164.65341: checking for max_fail_percentage 7554 1726853164.65342: done checking for max_fail_percentage 7554 1726853164.65343: checking to see if all hosts have failed and the running result is not ok 7554 1726853164.65346: done checking to see if all hosts have failed 7554 1726853164.65347: getting the remaining hosts for this loop 7554 1726853164.65348: done getting the remaining hosts for this loop 7554 1726853164.65352: getting the next task for host managed_node3 7554 1726853164.65358: done getting next task for host managed_node3 7554 1726853164.65361: ^ task is: TASK: Include the task 'assert_profile_present.yml' 7554 1726853164.65363: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853164.65367: getting variables 7554 1726853164.65368: in VariableManager get_vars() 7554 1726853164.65409: Calling all_inventory to load vars for managed_node3 7554 1726853164.65418: Calling groups_inventory to load vars for managed_node3 7554 1726853164.65422: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853164.65430: Calling all_plugins_play to load vars for managed_node3 7554 1726853164.65432: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853164.65435: Calling groups_plugins_play to load vars for managed_node3 7554 1726853164.66297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853164.67142: done with get_vars() 7554 1726853164.67158: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:44 Friday 20 September 2024 13:26:04 -0400 (0:00:00.036) 0:00:18.639 ****** 7554 1726853164.67224: entering _queue_task() for managed_node3/include_tasks 7554 1726853164.67454: worker is 1 (out of 1 available) 7554 1726853164.67473: exiting _queue_task() for managed_node3/include_tasks 7554 1726853164.67483: done queuing things up, now waiting for results queue to drain 7554 1726853164.67485: waiting for pending results... 
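The include task just entered (at `tests_auto_gateway.yml:44`) loads a nested tasks file rather than running a module, which is why the log shows no connection setup for it. A minimal sketch, assuming the relative path from the logged "processing included file" line:

```yaml
# Hypothetical sketch of the include at tests_auto_gateway.yml:44.
# The included file path is taken from the later
# "processing included file: .../tasks/assert_profile_present.yml" line.
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
```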
7554 1726853164.67653: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 7554 1726853164.67715: in run() - task 02083763-bbaf-bdc3-98b6-00000000005c 7554 1726853164.67727: variable 'ansible_search_path' from source: unknown 7554 1726853164.67756: calling self._execute() 7554 1726853164.67834: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.67840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.67849: variable 'omit' from source: magic vars 7554 1726853164.68116: variable 'ansible_distribution_major_version' from source: facts 7554 1726853164.68125: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853164.68130: _execute() done 7554 1726853164.68133: dumping result to json 7554 1726853164.68136: done dumping result, returning 7554 1726853164.68146: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [02083763-bbaf-bdc3-98b6-00000000005c] 7554 1726853164.68149: sending task result for task 02083763-bbaf-bdc3-98b6-00000000005c 7554 1726853164.68238: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000005c 7554 1726853164.68242: WORKER PROCESS EXITING 7554 1726853164.68291: no more pending results, returning what we have 7554 1726853164.68296: in VariableManager get_vars() 7554 1726853164.68342: Calling all_inventory to load vars for managed_node3 7554 1726853164.68347: Calling groups_inventory to load vars for managed_node3 7554 1726853164.68349: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853164.68359: Calling all_plugins_play to load vars for managed_node3 7554 1726853164.68361: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853164.68364: Calling groups_plugins_play to load vars for managed_node3 7554 1726853164.69121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 7554 1726853164.69988: done with get_vars() 7554 1726853164.70002: variable 'ansible_search_path' from source: unknown 7554 1726853164.70013: we have included files to process 7554 1726853164.70014: generating all_blocks data 7554 1726853164.70015: done generating all_blocks data 7554 1726853164.70017: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7554 1726853164.70018: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7554 1726853164.70019: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7554 1726853164.70154: in VariableManager get_vars() 7554 1726853164.70177: done with get_vars() 7554 1726853164.70347: done processing included file 7554 1726853164.70349: iterating over new_blocks loaded from include file 7554 1726853164.70350: in VariableManager get_vars() 7554 1726853164.70364: done with get_vars() 7554 1726853164.70366: filtering new block on tags 7554 1726853164.70380: done filtering new block on tags 7554 1726853164.70382: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 7554 1726853164.70387: extending task lists for all hosts with included blocks 7554 1726853164.72931: done extending task lists 7554 1726853164.72933: done processing included files 7554 1726853164.72933: results queue empty 7554 1726853164.72934: checking for any_errors_fatal 7554 1726853164.72936: done checking for any_errors_fatal 7554 1726853164.72936: checking for max_fail_percentage 7554 1726853164.72937: done checking for max_fail_percentage 7554 1726853164.72937: checking to see if all hosts have failed and the 
running result is not ok 7554 1726853164.72938: done checking to see if all hosts have failed 7554 1726853164.72939: getting the remaining hosts for this loop 7554 1726853164.72939: done getting the remaining hosts for this loop 7554 1726853164.72941: getting the next task for host managed_node3 7554 1726853164.72946: done getting next task for host managed_node3 7554 1726853164.72947: ^ task is: TASK: Include the task 'get_profile_stat.yml' 7554 1726853164.72949: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853164.72951: getting variables 7554 1726853164.72951: in VariableManager get_vars() 7554 1726853164.72966: Calling all_inventory to load vars for managed_node3 7554 1726853164.72967: Calling groups_inventory to load vars for managed_node3 7554 1726853164.72968: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853164.72975: Calling all_plugins_play to load vars for managed_node3 7554 1726853164.72977: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853164.72978: Calling groups_plugins_play to load vars for managed_node3 7554 1726853164.73626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853164.74488: done with get_vars() 7554 1726853164.74506: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:26:04 -0400 (0:00:00.073) 0:00:18.713 ****** 7554 1726853164.74562: entering _queue_task() for managed_node3/include_tasks 7554 1726853164.74823: worker is 1 (out of 1 available) 7554 1726853164.74840: exiting _queue_task() for managed_node3/include_tasks 7554 1726853164.74855: done queuing things up, now waiting for results queue to drain 7554 1726853164.74857: waiting for pending results... 
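This nested include (at `assert_profile_present.yml:3`) appears to pass a `profile` parameter: further down, the "Stat profile file" task logs "variable 'profile' from source: include params" followed by "variable 'interface' from source: play vars", so `profile` is presumably derived from `interface`. A sketch under that assumption:

```yaml
# Hypothetical sketch of the include at assert_profile_present.yml:3.
# "profile" is later logged as coming "from source: include params"
# and resolves through the play var "interface"; the exact wiring
# shown here is an assumption.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml
  vars:
    profile: "{{ interface }}"
```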
7554 1726853164.75038: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 7554 1726853164.75111: in run() - task 02083763-bbaf-bdc3-98b6-000000000ade 7554 1726853164.75121: variable 'ansible_search_path' from source: unknown 7554 1726853164.75125: variable 'ansible_search_path' from source: unknown 7554 1726853164.75156: calling self._execute() 7554 1726853164.75233: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.75238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.75250: variable 'omit' from source: magic vars 7554 1726853164.75525: variable 'ansible_distribution_major_version' from source: facts 7554 1726853164.75534: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853164.75540: _execute() done 7554 1726853164.75543: dumping result to json 7554 1726853164.75549: done dumping result, returning 7554 1726853164.75552: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-bdc3-98b6-000000000ade] 7554 1726853164.75558: sending task result for task 02083763-bbaf-bdc3-98b6-000000000ade 7554 1726853164.75640: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000ade 7554 1726853164.75643: WORKER PROCESS EXITING 7554 1726853164.75674: no more pending results, returning what we have 7554 1726853164.75679: in VariableManager get_vars() 7554 1726853164.75731: Calling all_inventory to load vars for managed_node3 7554 1726853164.75734: Calling groups_inventory to load vars for managed_node3 7554 1726853164.75736: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853164.75750: Calling all_plugins_play to load vars for managed_node3 7554 1726853164.75752: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853164.75755: Calling groups_plugins_play to load vars for managed_node3 7554 1726853164.76641: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853164.77495: done with get_vars() 7554 1726853164.77509: variable 'ansible_search_path' from source: unknown 7554 1726853164.77511: variable 'ansible_search_path' from source: unknown 7554 1726853164.77533: we have included files to process 7554 1726853164.77534: generating all_blocks data 7554 1726853164.77535: done generating all_blocks data 7554 1726853164.77536: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7554 1726853164.77537: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7554 1726853164.77538: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7554 1726853164.78207: done processing included file 7554 1726853164.78208: iterating over new_blocks loaded from include file 7554 1726853164.78209: in VariableManager get_vars() 7554 1726853164.78225: done with get_vars() 7554 1726853164.78226: filtering new block on tags 7554 1726853164.78239: done filtering new block on tags 7554 1726853164.78241: in VariableManager get_vars() 7554 1726853164.78255: done with get_vars() 7554 1726853164.78256: filtering new block on tags 7554 1726853164.78268: done filtering new block on tags 7554 1726853164.78270: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 7554 1726853164.78276: extending task lists for all hosts with included blocks 7554 1726853164.78366: done extending task lists 7554 1726853164.78367: done processing included files 7554 1726853164.78367: results queue empty 7554 1726853164.78368: checking for any_errors_fatal 7554 
1726853164.78370: done checking for any_errors_fatal 7554 1726853164.78370: checking for max_fail_percentage 7554 1726853164.78373: done checking for max_fail_percentage 7554 1726853164.78373: checking to see if all hosts have failed and the running result is not ok 7554 1726853164.78374: done checking to see if all hosts have failed 7554 1726853164.78375: getting the remaining hosts for this loop 7554 1726853164.78376: done getting the remaining hosts for this loop 7554 1726853164.78377: getting the next task for host managed_node3 7554 1726853164.78380: done getting next task for host managed_node3 7554 1726853164.78382: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 7554 1726853164.78384: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853164.78386: getting variables 7554 1726853164.78386: in VariableManager get_vars() 7554 1726853164.78426: Calling all_inventory to load vars for managed_node3 7554 1726853164.78427: Calling groups_inventory to load vars for managed_node3 7554 1726853164.78429: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853164.78432: Calling all_plugins_play to load vars for managed_node3 7554 1726853164.78434: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853164.78435: Calling groups_plugins_play to load vars for managed_node3 7554 1726853164.79034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853164.79921: done with get_vars() 7554 1726853164.79934: done getting variables 7554 1726853164.79964: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:26:04 -0400 (0:00:00.054) 0:00:18.767 ****** 7554 1726853164.79985: entering _queue_task() for managed_node3/set_fact 7554 1726853164.80237: worker is 1 (out of 1 available) 7554 1726853164.80255: exiting _queue_task() for managed_node3/set_fact 7554 1726853164.80268: done queuing things up, now waiting for results queue to drain 7554 1726853164.80270: waiting for pending results... 
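The `set_fact` task queued here (at `get_profile_stat.yml:3`) initializes three flags; their names and initial `false` values are confirmed by the `ansible_facts` payload in the task result further down. A sketch consistent with that result:

```yaml
# Hypothetical sketch of the set_fact at get_profile_stat.yml:3.
# The three flag names and their initial false values match the
# ansible_facts shown in the logged task result.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Since `set_fact` runs entirely on the controller side, the handler completes immediately with no `_low_level_execute_command()` calls, which matches the short gap between "running the handler" and "handler run complete" in the log.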
7554 1726853164.80710: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 7554 1726853164.80807: in run() - task 02083763-bbaf-bdc3-98b6-000000000cef 7554 1726853164.80811: variable 'ansible_search_path' from source: unknown 7554 1726853164.80815: variable 'ansible_search_path' from source: unknown 7554 1726853164.80880: calling self._execute() 7554 1726853164.80939: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.80948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.80955: variable 'omit' from source: magic vars 7554 1726853164.81353: variable 'ansible_distribution_major_version' from source: facts 7554 1726853164.81368: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853164.81375: variable 'omit' from source: magic vars 7554 1726853164.81429: variable 'omit' from source: magic vars 7554 1726853164.81464: variable 'omit' from source: magic vars 7554 1726853164.81505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853164.81550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853164.81573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853164.81591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853164.81604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853164.81642: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853164.81649: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.81652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 
1726853164.81786: Set connection var ansible_shell_executable to /bin/sh 7554 1726853164.81790: Set connection var ansible_pipelining to False 7554 1726853164.81792: Set connection var ansible_shell_type to sh 7554 1726853164.81794: Set connection var ansible_connection to ssh 7554 1726853164.81797: Set connection var ansible_timeout to 10 7554 1726853164.81799: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853164.81809: variable 'ansible_shell_executable' from source: unknown 7554 1726853164.81811: variable 'ansible_connection' from source: unknown 7554 1726853164.81815: variable 'ansible_module_compression' from source: unknown 7554 1726853164.81817: variable 'ansible_shell_type' from source: unknown 7554 1726853164.81820: variable 'ansible_shell_executable' from source: unknown 7554 1726853164.81823: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.81876: variable 'ansible_pipelining' from source: unknown 7554 1726853164.81880: variable 'ansible_timeout' from source: unknown 7554 1726853164.81883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.81984: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853164.81999: variable 'omit' from source: magic vars 7554 1726853164.82005: starting attempt loop 7554 1726853164.82009: running the handler 7554 1726853164.82054: handler run complete 7554 1726853164.82057: attempt loop complete, returning result 7554 1726853164.82060: _execute() done 7554 1726853164.82062: dumping result to json 7554 1726853164.82065: done dumping result, returning 7554 1726853164.82068: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and 
ansible_managed comment flag [02083763-bbaf-bdc3-98b6-000000000cef] 7554 1726853164.82072: sending task result for task 02083763-bbaf-bdc3-98b6-000000000cef 7554 1726853164.82294: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000cef 7554 1726853164.82298: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 7554 1726853164.82377: no more pending results, returning what we have 7554 1726853164.82380: results queue empty 7554 1726853164.82381: checking for any_errors_fatal 7554 1726853164.82382: done checking for any_errors_fatal 7554 1726853164.82383: checking for max_fail_percentage 7554 1726853164.82384: done checking for max_fail_percentage 7554 1726853164.82385: checking to see if all hosts have failed and the running result is not ok 7554 1726853164.82386: done checking to see if all hosts have failed 7554 1726853164.82387: getting the remaining hosts for this loop 7554 1726853164.82388: done getting the remaining hosts for this loop 7554 1726853164.82391: getting the next task for host managed_node3 7554 1726853164.82396: done getting next task for host managed_node3 7554 1726853164.82399: ^ task is: TASK: Stat profile file 7554 1726853164.82403: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853164.82406: getting variables 7554 1726853164.82407: in VariableManager get_vars() 7554 1726853164.82450: Calling all_inventory to load vars for managed_node3 7554 1726853164.82453: Calling groups_inventory to load vars for managed_node3 7554 1726853164.82455: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853164.82463: Calling all_plugins_play to load vars for managed_node3 7554 1726853164.82466: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853164.82468: Calling groups_plugins_play to load vars for managed_node3 7554 1726853164.83737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853164.85283: done with get_vars() 7554 1726853164.85299: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:26:04 -0400 (0:00:00.053) 0:00:18.821 ****** 7554 1726853164.85365: entering _queue_task() for managed_node3/stat 7554 1726853164.85589: worker is 1 (out of 1 available) 7554 1726853164.85602: exiting _queue_task() for managed_node3/stat 7554 1726853164.85614: done queuing things up, now waiting for results queue to drain 7554 1726853164.85616: waiting for pending results... 
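The "Stat profile file" task entered here (at `get_profile_stat.yml:9`) is a remote `stat` (the log loads the `normal` action plugin and starts `_low_level_execute_command()`), templated from the `profile` include param. The path it stats is not visible in this log excerpt; the location below is an illustrative guess at a typical initscripts profile path, not taken from the log:

```yaml
# Hypothetical sketch of the "Stat profile file" task at
# get_profile_stat.yml:9. The module (stat) and the use of the
# "profile" include param are logged; the path and the register
# name are assumptions for illustration only.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat
```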
7554 1726853164.85788: running TaskExecutor() for managed_node3/TASK: Stat profile file 7554 1726853164.85857: in run() - task 02083763-bbaf-bdc3-98b6-000000000cf0 7554 1726853164.85869: variable 'ansible_search_path' from source: unknown 7554 1726853164.85874: variable 'ansible_search_path' from source: unknown 7554 1726853164.85900: calling self._execute() 7554 1726853164.85972: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.85975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.85986: variable 'omit' from source: magic vars 7554 1726853164.86248: variable 'ansible_distribution_major_version' from source: facts 7554 1726853164.86256: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853164.86262: variable 'omit' from source: magic vars 7554 1726853164.86292: variable 'omit' from source: magic vars 7554 1726853164.86360: variable 'profile' from source: include params 7554 1726853164.86364: variable 'interface' from source: play vars 7554 1726853164.86420: variable 'interface' from source: play vars 7554 1726853164.86432: variable 'omit' from source: magic vars 7554 1726853164.86465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853164.86493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853164.86511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853164.86527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853164.86536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853164.86559: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853164.86562: variable 
'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.86565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.86637: Set connection var ansible_shell_executable to /bin/sh 7554 1726853164.86640: Set connection var ansible_pipelining to False 7554 1726853164.86643: Set connection var ansible_shell_type to sh 7554 1726853164.86647: Set connection var ansible_connection to ssh 7554 1726853164.86655: Set connection var ansible_timeout to 10 7554 1726853164.86660: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853164.86679: variable 'ansible_shell_executable' from source: unknown 7554 1726853164.86682: variable 'ansible_connection' from source: unknown 7554 1726853164.86685: variable 'ansible_module_compression' from source: unknown 7554 1726853164.86687: variable 'ansible_shell_type' from source: unknown 7554 1726853164.86690: variable 'ansible_shell_executable' from source: unknown 7554 1726853164.86692: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853164.86696: variable 'ansible_pipelining' from source: unknown 7554 1726853164.86699: variable 'ansible_timeout' from source: unknown 7554 1726853164.86704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853164.86849: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853164.86853: variable 'omit' from source: magic vars 7554 1726853164.86859: starting attempt loop 7554 1726853164.86862: running the handler 7554 1726853164.86876: _low_level_execute_command(): starting 7554 1726853164.86883: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853164.87377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853164.87381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853164.87385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.87441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853164.87446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853164.87451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853164.87524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853164.89230: stdout chunk (state=3): >>>/root <<< 7554 1726853164.89331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853164.89360: stderr chunk (state=3): >>><<< 7554 1726853164.89363: stdout chunk (state=3): >>><<< 7554 1726853164.89386: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853164.89398: _low_level_execute_command(): starting 7554 1726853164.89404: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433 `" && echo ansible-tmp-1726853164.8938684-8313-46543971932433="` echo /root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433 `" ) && sleep 0' 7554 1726853164.89835: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853164.89843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853164.89875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.89886: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853164.89889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853164.89891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.89936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853164.89939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853164.89946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853164.90009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853164.91982: stdout chunk (state=3): >>>ansible-tmp-1726853164.8938684-8313-46543971932433=/root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433 <<< 7554 1726853164.92088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853164.92116: stderr chunk (state=3): >>><<< 7554 1726853164.92119: stdout chunk (state=3): >>><<< 7554 1726853164.92135: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853164.8938684-8313-46543971932433=/root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853164.92183: variable 'ansible_module_compression' from source: unknown 7554 1726853164.92227: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7554 1726853164.92261: variable 'ansible_facts' from source: unknown 7554 1726853164.92328: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433/AnsiballZ_stat.py 7554 1726853164.92430: Sending initial data 7554 1726853164.92434: Sent initial data (150 bytes) 7554 1726853164.92888: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853164.92891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853164.92893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.92896: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853164.92899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.92951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853164.92954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853164.92960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853164.93021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853164.94666: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853164.94719: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853164.94779: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpwl1k23nb /root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433/AnsiballZ_stat.py <<< 7554 1726853164.94784: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433/AnsiballZ_stat.py" <<< 7554 1726853164.94834: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpwl1k23nb" to remote "/root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433/AnsiballZ_stat.py" <<< 7554 1726853164.94843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433/AnsiballZ_stat.py" <<< 7554 1726853164.95443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853164.95488: stderr chunk (state=3): >>><<< 7554 1726853164.95491: stdout chunk (state=3): >>><<< 7554 1726853164.95530: done transferring module to remote 7554 1726853164.95539: _low_level_execute_command(): starting 7554 1726853164.95546: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433/ /root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433/AnsiballZ_stat.py && sleep 0' 7554 1726853164.95997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853164.96000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853164.96003: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.96005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853164.96012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.96063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853164.96070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853164.96128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853164.97977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853164.98005: stderr chunk (state=3): >>><<< 7554 1726853164.98008: stdout chunk (state=3): >>><<< 7554 1726853164.98021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853164.98024: _low_level_execute_command(): starting 7554 1726853164.98029: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433/AnsiballZ_stat.py && sleep 0' 7554 1726853164.98473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853164.98477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.98479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853164.98482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853164.98484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853164.98534: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853164.98544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853164.98548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853164.98602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853165.14319: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7554 1726853165.15754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853165.15784: stderr chunk (state=3): >>><<< 7554 1726853165.15787: stdout chunk (state=3): >>><<< 7554 1726853165.15802: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853165.15827: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853165.15836: _low_level_execute_command(): starting 7554 1726853165.15839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853164.8938684-8313-46543971932433/ > /dev/null 2>&1 && sleep 0' 7554 1726853165.16308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853165.16311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 
7554 1726853165.16314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853165.16316: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853165.16322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.16383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853165.16387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853165.16390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853165.16447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853165.18347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853165.18373: stderr chunk (state=3): >>><<< 7554 1726853165.18376: stdout chunk (state=3): >>><<< 7554 1726853165.18391: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853165.18398: handler run complete 7554 1726853165.18416: attempt loop complete, returning result 7554 1726853165.18419: _execute() done 7554 1726853165.18423: dumping result to json 7554 1726853165.18425: done dumping result, returning 7554 1726853165.18432: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-bdc3-98b6-000000000cf0] 7554 1726853165.18438: sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf0 7554 1726853165.18533: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf0 7554 1726853165.18536: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7554 1726853165.18595: no more pending results, returning what we have 7554 1726853165.18598: results queue empty 7554 1726853165.18599: checking for any_errors_fatal 7554 1726853165.18605: done checking for any_errors_fatal 7554 1726853165.18605: checking for max_fail_percentage 7554 1726853165.18607: done checking for max_fail_percentage 7554 1726853165.18607: checking to see if all hosts have failed and the running result is not ok 7554 1726853165.18609: done checking to see if all hosts have failed 7554 1726853165.18609: getting the remaining hosts for this loop 7554 1726853165.18610: done getting the remaining hosts for this loop 7554 1726853165.18614: getting 
the next task for host managed_node3 7554 1726853165.18621: done getting next task for host managed_node3 7554 1726853165.18623: ^ task is: TASK: Set NM profile exist flag based on the profile files 7554 1726853165.18627: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853165.18632: getting variables 7554 1726853165.18633: in VariableManager get_vars() 7554 1726853165.18685: Calling all_inventory to load vars for managed_node3 7554 1726853165.18688: Calling groups_inventory to load vars for managed_node3 7554 1726853165.18690: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853165.18700: Calling all_plugins_play to load vars for managed_node3 7554 1726853165.18703: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853165.18705: Calling groups_plugins_play to load vars for managed_node3 7554 1726853165.19612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853165.20454: done with get_vars() 7554 1726853165.20469: done getting variables 7554 1726853165.20517: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:26:05 -0400 (0:00:00.351) 0:00:19.172 ****** 7554 1726853165.20539: entering _queue_task() for managed_node3/set_fact 7554 1726853165.20792: worker is 1 (out of 1 available) 7554 1726853165.20805: exiting _queue_task() for managed_node3/set_fact 7554 1726853165.20818: done queuing things up, now waiting for results queue to drain 7554 1726853165.20820: waiting for pending results... 
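The nested `HOST STATE: block=…, task=…` dumps above (one level per included task file, each with optional `tasks`/`rescue`/`always` child states) can be read as a small recursive structure. A hypothetical reconstruction for illustration — field names follow the log output, not Ansible's actual `PlayIterator`/`HostState` implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    # Counters from the log: which block/task the host is on, plus flags
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    run_state: int = 1
    fail_state: int = 0
    # Child states mirror the "tasks child state? (...)" nesting in the dump
    tasks_child_state: Optional["HostState"] = None
    rescue_child_state: Optional["HostState"] = None
    always_child_state: Optional["HostState"] = None
    did_rescue: bool = False
    did_start_at_task: bool = False

    def depth(self) -> int:
        """How deeply nested the current task is (each include adds a level)."""
        child = self.tasks_child_state
        return 1 + (child.depth() if child else 0)

# A dump shaped like the ones above: block=2/task=13 at the top,
# with two nested include levels underneath
state = HostState(block=2, task=13,
                  tasks_child_state=HostState(block=0, task=2,
                      tasks_child_state=HostState(block=0, task=3)))
print(state.depth())
```

Reading the dumps this way, the inner `task=` counter advancing (2, then 3, then 4 across the dumps) tracks progress through `get_profile_stat.yml` while the outer counters stay fixed.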
7554 1726853165.21006: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 7554 1726853165.21089: in run() - task 02083763-bbaf-bdc3-98b6-000000000cf1 7554 1726853165.21100: variable 'ansible_search_path' from source: unknown 7554 1726853165.21103: variable 'ansible_search_path' from source: unknown 7554 1726853165.21131: calling self._execute() 7554 1726853165.21212: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.21216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.21224: variable 'omit' from source: magic vars 7554 1726853165.21505: variable 'ansible_distribution_major_version' from source: facts 7554 1726853165.21515: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853165.21603: variable 'profile_stat' from source: set_fact 7554 1726853165.21613: Evaluated conditional (profile_stat.stat.exists): False 7554 1726853165.21616: when evaluation is False, skipping this task 7554 1726853165.21619: _execute() done 7554 1726853165.21622: dumping result to json 7554 1726853165.21624: done dumping result, returning 7554 1726853165.21630: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-bdc3-98b6-000000000cf1] 7554 1726853165.21636: sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf1 7554 1726853165.21719: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf1 7554 1726853165.21722: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7554 1726853165.21765: no more pending results, returning what we have 7554 1726853165.21768: results queue empty 7554 1726853165.21769: checking for any_errors_fatal 7554 1726853165.21779: done checking for any_errors_fatal 7554 1726853165.21779: checking for 
max_fail_percentage 7554 1726853165.21781: done checking for max_fail_percentage 7554 1726853165.21782: checking to see if all hosts have failed and the running result is not ok 7554 1726853165.21783: done checking to see if all hosts have failed 7554 1726853165.21784: getting the remaining hosts for this loop 7554 1726853165.21785: done getting the remaining hosts for this loop 7554 1726853165.21788: getting the next task for host managed_node3 7554 1726853165.21794: done getting next task for host managed_node3 7554 1726853165.21796: ^ task is: TASK: Get NM profile info 7554 1726853165.21800: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853165.21805: getting variables 7554 1726853165.21807: in VariableManager get_vars() 7554 1726853165.21853: Calling all_inventory to load vars for managed_node3 7554 1726853165.21855: Calling groups_inventory to load vars for managed_node3 7554 1726853165.21858: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853165.21869: Calling all_plugins_play to load vars for managed_node3 7554 1726853165.21878: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853165.21882: Calling groups_plugins_play to load vars for managed_node3 7554 1726853165.22654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853165.23490: done with get_vars() 7554 1726853165.23509: done getting variables 7554 1726853165.23580: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:26:05 -0400 (0:00:00.030) 0:00:19.203 ****** 7554 1726853165.23601: entering _queue_task() for managed_node3/shell 7554 1726853165.23602: Creating lock for shell 7554 1726853165.23854: worker is 1 (out of 1 available) 7554 1726853165.23867: exiting _queue_task() for managed_node3/shell 7554 1726853165.23880: done queuing things up, now waiting for results queue to drain 7554 1726853165.23882: waiting for pending results... 
7554 1726853165.24061: running TaskExecutor() for managed_node3/TASK: Get NM profile info 7554 1726853165.24135: in run() - task 02083763-bbaf-bdc3-98b6-000000000cf2 7554 1726853165.24147: variable 'ansible_search_path' from source: unknown 7554 1726853165.24152: variable 'ansible_search_path' from source: unknown 7554 1726853165.24182: calling self._execute() 7554 1726853165.24257: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.24260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.24269: variable 'omit' from source: magic vars 7554 1726853165.24544: variable 'ansible_distribution_major_version' from source: facts 7554 1726853165.24556: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853165.24562: variable 'omit' from source: magic vars 7554 1726853165.24593: variable 'omit' from source: magic vars 7554 1726853165.24666: variable 'profile' from source: include params 7554 1726853165.24670: variable 'interface' from source: play vars 7554 1726853165.24721: variable 'interface' from source: play vars 7554 1726853165.24736: variable 'omit' from source: magic vars 7554 1726853165.24775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853165.24804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853165.24820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853165.24833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853165.24843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853165.24869: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853165.24875: variable 
'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.24878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.24947: Set connection var ansible_shell_executable to /bin/sh 7554 1726853165.24956: Set connection var ansible_pipelining to False 7554 1726853165.24958: Set connection var ansible_shell_type to sh 7554 1726853165.24961: Set connection var ansible_connection to ssh 7554 1726853165.24968: Set connection var ansible_timeout to 10 7554 1726853165.24976: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853165.24995: variable 'ansible_shell_executable' from source: unknown 7554 1726853165.24998: variable 'ansible_connection' from source: unknown 7554 1726853165.25000: variable 'ansible_module_compression' from source: unknown 7554 1726853165.25002: variable 'ansible_shell_type' from source: unknown 7554 1726853165.25005: variable 'ansible_shell_executable' from source: unknown 7554 1726853165.25007: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.25009: variable 'ansible_pipelining' from source: unknown 7554 1726853165.25011: variable 'ansible_timeout' from source: unknown 7554 1726853165.25013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.25120: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853165.25128: variable 'omit' from source: magic vars 7554 1726853165.25132: starting attempt loop 7554 1726853165.25135: running the handler 7554 1726853165.25144: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853165.25162: _low_level_execute_command(): starting 7554 1726853165.25169: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853165.25666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853165.25697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853165.25701: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853165.25703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.25757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853165.25761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853165.25765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853165.25828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853165.27564: stdout chunk (state=3): >>>/root <<< 7554 1726853165.27662: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853165.27694: stderr chunk (state=3): >>><<< 7554 1726853165.27697: stdout chunk (state=3): >>><<< 7554 1726853165.27718: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853165.27731: _low_level_execute_command(): starting 7554 1726853165.27737: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607 `" && echo ansible-tmp-1726853165.2771728-8322-14813134522607="` echo /root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607 `" ) && sleep 0' 7554 1726853165.28164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 7554 1726853165.28197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853165.28200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853165.28204: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853165.28206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.28256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853165.28259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853165.28262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853165.28328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853165.30337: stdout chunk (state=3): >>>ansible-tmp-1726853165.2771728-8322-14813134522607=/root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607 <<< 7554 1726853165.30448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853165.30473: stderr chunk (state=3): >>><<< 7554 1726853165.30476: stdout chunk (state=3): >>><<< 7554 1726853165.30490: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853165.2771728-8322-14813134522607=/root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853165.30521: variable 'ansible_module_compression' from source: unknown 7554 1726853165.30558: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853165.30590: variable 'ansible_facts' from source: unknown 7554 1726853165.30645: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607/AnsiballZ_command.py 7554 1726853165.30751: Sending initial data 7554 1726853165.30754: Sent initial data (153 bytes) 7554 1726853165.31165: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
<<< 7554 1726853165.31175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853165.31203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.31206: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853165.31208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.31263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853165.31269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853165.31274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853165.31329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853165.32978: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853165.32982: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853165.33036: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853165.33096: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmplwbuwjat /root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607/AnsiballZ_command.py <<< 7554 1726853165.33102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607/AnsiballZ_command.py" <<< 7554 1726853165.33155: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmplwbuwjat" to remote "/root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607/AnsiballZ_command.py" <<< 7554 1726853165.33159: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607/AnsiballZ_command.py" <<< 7554 1726853165.33742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853165.33785: stderr chunk (state=3): >>><<< 7554 1726853165.33788: stdout chunk (state=3): >>><<< 7554 1726853165.33809: done transferring module to remote 7554 1726853165.33818: _low_level_execute_command(): starting 7554 1726853165.33822: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607/ /root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607/AnsiballZ_command.py && sleep 0' 7554 1726853165.34233: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 7554 1726853165.34242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853165.34268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.34274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853165.34277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.34330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853165.34333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853165.34336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853165.34404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853165.36296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853165.36320: stderr chunk (state=3): >>><<< 7554 1726853165.36324: stdout chunk (state=3): >>><<< 7554 1726853165.36340: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853165.36346: _low_level_execute_command(): starting 7554 1726853165.36348: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607/AnsiballZ_command.py && sleep 0' 7554 1726853165.36804: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853165.36808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.36810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853165.36812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.36860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853165.36863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853165.36865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853165.36938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853165.67995: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 13:26:05.526032", "end": "2024-09-20 13:26:05.678168", "delta": "0:00:00.152136", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853165.69683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853165.69711: stderr chunk (state=3): >>><<< 7554 1726853165.69714: stdout chunk (state=3): >>><<< 7554 1726853165.69730: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 13:26:05.526032", "end": "2024-09-20 13:26:05.678168", "delta": "0:00:00.152136", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.11.217 closed. 7554 1726853165.69763: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853165.69774: _low_level_execute_command(): starting 7554 1726853165.69777: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853165.2771728-8322-14813134522607/ > /dev/null 2>&1 && sleep 0' 7554 1726853165.70233: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853165.70236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853165.70239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.70241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7554 1726853165.70245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853165.70300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853165.70305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853165.70311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853165.70366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853165.72240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853165.72265: stderr chunk (state=3): >>><<< 7554 1726853165.72268: stdout chunk (state=3): >>><<< 7554 1726853165.72286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853165.72294: handler run complete 7554 1726853165.72310: Evaluated conditional (False): False 7554 1726853165.72318: attempt loop complete, returning result 7554 1726853165.72320: _execute() done 7554 1726853165.72323: dumping result to json 7554 1726853165.72327: done dumping result, returning 7554 1726853165.72334: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-bdc3-98b6-000000000cf2] 7554 1726853165.72339: sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf2 7554 1726853165.72437: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf2 7554 1726853165.72439: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.152136", "end": "2024-09-20 13:26:05.678168", "rc": 0, "start": "2024-09-20 13:26:05.526032" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 7554 1726853165.72514: no more pending results, returning what we have 7554 1726853165.72517: results queue empty 7554 1726853165.72518: checking for any_errors_fatal 7554 1726853165.72524: done checking for any_errors_fatal 7554 1726853165.72524: checking for max_fail_percentage 7554 1726853165.72526: done checking for max_fail_percentage 7554 1726853165.72527: checking to see if all hosts have failed and the running result is not ok 7554 1726853165.72528: done checking to see if all hosts have failed 7554 1726853165.72528: getting the remaining hosts for this loop 7554 1726853165.72530: done getting the remaining hosts for this loop 7554 1726853165.72534: getting the next task for host managed_node3 7554 1726853165.72540: done getting next task for host managed_node3 7554 1726853165.72543: ^ 
task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7554 1726853165.72546: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853165.72557: getting variables 7554 1726853165.72559: in VariableManager get_vars() 7554 1726853165.72605: Calling all_inventory to load vars for managed_node3 7554 1726853165.72607: Calling groups_inventory to load vars for managed_node3 7554 1726853165.72609: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853165.72620: Calling all_plugins_play to load vars for managed_node3 7554 1726853165.72622: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853165.72625: Calling groups_plugins_play to load vars for managed_node3 7554 1726853165.73526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853165.74363: done with get_vars() 7554 1726853165.74380: done getting variables 7554 1726853165.74424: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:26:05 -0400 (0:00:00.508) 0:00:19.712 ****** 7554 1726853165.74446: entering _queue_task() for managed_node3/set_fact 7554 1726853165.74668: worker is 1 (out of 1 available) 7554 1726853165.74684: exiting _queue_task() for managed_node3/set_fact 7554 1726853165.74698: done queuing things up, now waiting for results queue to drain 7554 1726853165.74699: waiting for pending results... 7554 1726853165.74874: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7554 1726853165.74957: in run() - task 02083763-bbaf-bdc3-98b6-000000000cf3 7554 1726853165.74969: variable 'ansible_search_path' from source: unknown 7554 1726853165.74974: variable 'ansible_search_path' from source: unknown 7554 1726853165.75002: calling self._execute() 7554 1726853165.75077: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.75083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.75091: variable 'omit' from source: magic vars 7554 1726853165.75359: variable 'ansible_distribution_major_version' from source: facts 7554 1726853165.75373: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853165.75459: variable 'nm_profile_exists' from source: set_fact 7554 1726853165.75478: Evaluated conditional (nm_profile_exists.rc == 0): True 7554 1726853165.75481: variable 'omit' from source: magic vars 7554 1726853165.75508: variable 'omit' from source: magic vars 7554 1726853165.75529: variable 'omit' 
from source: magic vars 7554 1726853165.75562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853165.75592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853165.75609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853165.75623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853165.75632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853165.75659: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853165.75662: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.75664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.75734: Set connection var ansible_shell_executable to /bin/sh 7554 1726853165.75740: Set connection var ansible_pipelining to False 7554 1726853165.75743: Set connection var ansible_shell_type to sh 7554 1726853165.75748: Set connection var ansible_connection to ssh 7554 1726853165.75756: Set connection var ansible_timeout to 10 7554 1726853165.75761: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853165.75779: variable 'ansible_shell_executable' from source: unknown 7554 1726853165.75782: variable 'ansible_connection' from source: unknown 7554 1726853165.75785: variable 'ansible_module_compression' from source: unknown 7554 1726853165.75787: variable 'ansible_shell_type' from source: unknown 7554 1726853165.75789: variable 'ansible_shell_executable' from source: unknown 7554 1726853165.75791: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.75797: variable 'ansible_pipelining' from source: unknown 7554 1726853165.75799: 
variable 'ansible_timeout' from source: unknown 7554 1726853165.75802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.75902: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853165.75912: variable 'omit' from source: magic vars 7554 1726853165.75915: starting attempt loop 7554 1726853165.75919: running the handler 7554 1726853165.75930: handler run complete 7554 1726853165.75938: attempt loop complete, returning result 7554 1726853165.75941: _execute() done 7554 1726853165.75943: dumping result to json 7554 1726853165.75948: done dumping result, returning 7554 1726853165.75955: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-bdc3-98b6-000000000cf3] 7554 1726853165.75960: sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf3 7554 1726853165.76042: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf3 7554 1726853165.76046: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 7554 1726853165.76099: no more pending results, returning what we have 7554 1726853165.76102: results queue empty 7554 1726853165.76103: checking for any_errors_fatal 7554 1726853165.76113: done checking for any_errors_fatal 7554 1726853165.76114: checking for max_fail_percentage 7554 1726853165.76116: done checking for max_fail_percentage 7554 1726853165.76116: checking to see if all hosts have failed and the running result is not ok 7554 1726853165.76117: done checking to see if all hosts have failed 7554 
1726853165.76118: getting the remaining hosts for this loop 7554 1726853165.76119: done getting the remaining hosts for this loop 7554 1726853165.76122: getting the next task for host managed_node3 7554 1726853165.76129: done getting next task for host managed_node3 7554 1726853165.76132: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 7554 1726853165.76136: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853165.76145: getting variables 7554 1726853165.76147: in VariableManager get_vars() 7554 1726853165.76185: Calling all_inventory to load vars for managed_node3 7554 1726853165.76187: Calling groups_inventory to load vars for managed_node3 7554 1726853165.76189: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853165.76197: Calling all_plugins_play to load vars for managed_node3 7554 1726853165.76200: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853165.76202: Calling groups_plugins_play to load vars for managed_node3 7554 1726853165.76934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853165.77775: done with get_vars() 7554 1726853165.77789: done getting variables 7554 1726853165.77831: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853165.77915: variable 'profile' from source: include params 7554 1726853165.77918: variable 'interface' from source: play vars 7554 1726853165.77960: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:26:05 -0400 (0:00:00.035) 0:00:19.747 ****** 7554 1726853165.77990: entering _queue_task() for managed_node3/command 7554 1726853165.78208: worker is 1 (out of 1 available) 7554 1726853165.78223: exiting _queue_task() for managed_node3/command 7554 1726853165.78233: done queuing things up, now waiting for results queue to drain 7554 1726853165.78234: waiting for pending results... 
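The two task results logged above can be summarized outside of Ansible: the "Get NM profile info" task ran an nmcli pipeline that exited 0, and the set_fact task then flipped three flags because `nm_profile_exists.rc == 0`. A minimal shell sketch of that logic (not part of the original run; it replays the filter on the nmcli output captured in the log rather than querying a live NetworkManager):

```shell
# Sample line taken from the STDOUT recorded in the log above.
sample='veth0      /etc/NetworkManager/system-connections/veth0.nmconnection'

# Same filter as the logged command:
#   nmcli -f NAME,FILENAME connection show | grep veth0 | grep /etc
if printf '%s\n' "$sample" | grep veth0 | grep -q /etc; then
    rc=0
else
    rc=1
fi

# Mirror of the conditional (nm_profile_exists.rc == 0) that the
# set_fact task evaluated before setting the three facts.
if [ "$rc" -eq 0 ]; then
    lsr_net_profile_exists=true
    lsr_net_profile_ansible_managed=true
    lsr_net_profile_fingerprint=true
fi
echo "exists=$lsr_net_profile_exists managed=$lsr_net_profile_ansible_managed fingerprint=$lsr_net_profile_fingerprint"
```

With the sample line above, the filter matches and all three flags come out `true`, matching the `ansible_facts` shown in the task result.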
7554 1726853165.78412: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 7554 1726853165.78487: in run() - task 02083763-bbaf-bdc3-98b6-000000000cf5 7554 1726853165.78499: variable 'ansible_search_path' from source: unknown 7554 1726853165.78503: variable 'ansible_search_path' from source: unknown 7554 1726853165.78528: calling self._execute() 7554 1726853165.78608: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.78611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.78620: variable 'omit' from source: magic vars 7554 1726853165.78883: variable 'ansible_distribution_major_version' from source: facts 7554 1726853165.78898: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853165.78978: variable 'profile_stat' from source: set_fact 7554 1726853165.78991: Evaluated conditional (profile_stat.stat.exists): False 7554 1726853165.78994: when evaluation is False, skipping this task 7554 1726853165.78997: _execute() done 7554 1726853165.79001: dumping result to json 7554 1726853165.79003: done dumping result, returning 7554 1726853165.79016: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [02083763-bbaf-bdc3-98b6-000000000cf5] 7554 1726853165.79020: sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf5 7554 1726853165.79098: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf5 7554 1726853165.79101: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7554 1726853165.79159: no more pending results, returning what we have 7554 1726853165.79162: results queue empty 7554 1726853165.79163: checking for any_errors_fatal 7554 1726853165.79172: done checking for any_errors_fatal 7554 1726853165.79173: checking for max_fail_percentage 
7554 1726853165.79174: done checking for max_fail_percentage 7554 1726853165.79175: checking to see if all hosts have failed and the running result is not ok 7554 1726853165.79176: done checking to see if all hosts have failed 7554 1726853165.79177: getting the remaining hosts for this loop 7554 1726853165.79178: done getting the remaining hosts for this loop 7554 1726853165.79181: getting the next task for host managed_node3 7554 1726853165.79186: done getting next task for host managed_node3 7554 1726853165.79189: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 7554 1726853165.79192: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853165.79196: getting variables 7554 1726853165.79197: in VariableManager get_vars() 7554 1726853165.79235: Calling all_inventory to load vars for managed_node3 7554 1726853165.79238: Calling groups_inventory to load vars for managed_node3 7554 1726853165.79239: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853165.79249: Calling all_plugins_play to load vars for managed_node3 7554 1726853165.79251: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853165.79254: Calling groups_plugins_play to load vars for managed_node3 7554 1726853165.80090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853165.80930: done with get_vars() 7554 1726853165.80946: done getting variables 7554 1726853165.80989: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853165.81064: variable 'profile' from source: include params 7554 1726853165.81067: variable 'interface' from source: play vars 7554 1726853165.81107: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:26:05 -0400 (0:00:00.031) 0:00:19.778 ****** 7554 1726853165.81131: entering _queue_task() for managed_node3/set_fact 7554 1726853165.81354: worker is 1 (out of 1 available) 7554 1726853165.81372: exiting _queue_task() for managed_node3/set_fact 7554 1726853165.81384: done queuing things up, now waiting for results queue to drain 7554 1726853165.81386: waiting for pending results... 
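The ifcfg comment tasks in this stretch are all skipped for the same reason: `Evaluated conditional (profile_stat.stat.exists): False`. A hedged sketch of that gate in shell (the ifcfg path is an assumption for illustration; the log never prints the stat'd path):

```shell
# Mirrors profile_stat.stat.exists for a given path: true when the
# file exists, false otherwise.
profile_stat_exists() {
    [ -e "$1" ]
}

# Assumed location of a legacy ifcfg profile; not shown in the log.
ifcfg="/etc/sysconfig/network-scripts/ifcfg-veth0"
if profile_stat_exists "$ifcfg"; then
    echo "run task"
else
    echo "skipping: Conditional result was False"
fi
```

On a host that only has the keyfile profile under /etc/NetworkManager/system-connections (as the nmcli output earlier suggests), no ifcfg file exists, so the gate is false and each dependent task is skipped rather than failed.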
7554 1726853165.81559: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 7554 1726853165.81640: in run() - task 02083763-bbaf-bdc3-98b6-000000000cf6 7554 1726853165.81653: variable 'ansible_search_path' from source: unknown 7554 1726853165.81657: variable 'ansible_search_path' from source: unknown 7554 1726853165.81686: calling self._execute() 7554 1726853165.81761: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.81765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.81777: variable 'omit' from source: magic vars 7554 1726853165.82036: variable 'ansible_distribution_major_version' from source: facts 7554 1726853165.82049: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853165.82133: variable 'profile_stat' from source: set_fact 7554 1726853165.82147: Evaluated conditional (profile_stat.stat.exists): False 7554 1726853165.82150: when evaluation is False, skipping this task 7554 1726853165.82153: _execute() done 7554 1726853165.82155: dumping result to json 7554 1726853165.82157: done dumping result, returning 7554 1726853165.82169: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [02083763-bbaf-bdc3-98b6-000000000cf6] 7554 1726853165.82174: sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf6 7554 1726853165.82259: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf6 7554 1726853165.82262: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7554 1726853165.82315: no more pending results, returning what we have 7554 1726853165.82319: results queue empty 7554 1726853165.82319: checking for any_errors_fatal 7554 1726853165.82326: done checking for any_errors_fatal 7554 1726853165.82326: checking for 
max_fail_percentage 7554 1726853165.82328: done checking for max_fail_percentage 7554 1726853165.82329: checking to see if all hosts have failed and the running result is not ok 7554 1726853165.82330: done checking to see if all hosts have failed 7554 1726853165.82330: getting the remaining hosts for this loop 7554 1726853165.82332: done getting the remaining hosts for this loop 7554 1726853165.82335: getting the next task for host managed_node3 7554 1726853165.82343: done getting next task for host managed_node3 7554 1726853165.82346: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 7554 1726853165.82349: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853165.82353: getting variables 7554 1726853165.82354: in VariableManager get_vars() 7554 1726853165.82398: Calling all_inventory to load vars for managed_node3 7554 1726853165.82400: Calling groups_inventory to load vars for managed_node3 7554 1726853165.82402: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853165.82414: Calling all_plugins_play to load vars for managed_node3 7554 1726853165.82416: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853165.82419: Calling groups_plugins_play to load vars for managed_node3 7554 1726853165.83181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853165.84112: done with get_vars() 7554 1726853165.84126: done getting variables 7554 1726853165.84168: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853165.84248: variable 'profile' from source: include params 7554 1726853165.84251: variable 'interface' from source: play vars 7554 1726853165.84290: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:26:05 -0400 (0:00:00.031) 0:00:19.810 ****** 7554 1726853165.84314: entering _queue_task() for managed_node3/command 7554 1726853165.84536: worker is 1 (out of 1 available) 7554 1726853165.84554: exiting _queue_task() for managed_node3/command 7554 1726853165.84565: done queuing things up, now waiting for results queue to drain 7554 1726853165.84567: waiting for pending results... 
7554 1726853165.84743: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 7554 1726853165.84820: in run() - task 02083763-bbaf-bdc3-98b6-000000000cf7 7554 1726853165.84829: variable 'ansible_search_path' from source: unknown 7554 1726853165.84832: variable 'ansible_search_path' from source: unknown 7554 1726853165.84861: calling self._execute() 7554 1726853165.84962: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.84966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.84969: variable 'omit' from source: magic vars 7554 1726853165.85253: variable 'ansible_distribution_major_version' from source: facts 7554 1726853165.85297: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853165.85476: variable 'profile_stat' from source: set_fact 7554 1726853165.85480: Evaluated conditional (profile_stat.stat.exists): False 7554 1726853165.85482: when evaluation is False, skipping this task 7554 1726853165.85485: _execute() done 7554 1726853165.85487: dumping result to json 7554 1726853165.85489: done dumping result, returning 7554 1726853165.85491: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 [02083763-bbaf-bdc3-98b6-000000000cf7] 7554 1726853165.85493: sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf7 7554 1726853165.85556: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf7 7554 1726853165.85559: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7554 1726853165.85764: no more pending results, returning what we have 7554 1726853165.85768: results queue empty 7554 1726853165.85769: checking for any_errors_fatal 7554 1726853165.85777: done checking for any_errors_fatal 7554 1726853165.85778: checking for max_fail_percentage 7554 
1726853165.85779: done checking for max_fail_percentage 7554 1726853165.85780: checking to see if all hosts have failed and the running result is not ok 7554 1726853165.85782: done checking to see if all hosts have failed 7554 1726853165.85782: getting the remaining hosts for this loop 7554 1726853165.85783: done getting the remaining hosts for this loop 7554 1726853165.85787: getting the next task for host managed_node3 7554 1726853165.85793: done getting next task for host managed_node3 7554 1726853165.85796: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 7554 1726853165.85799: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853165.85803: getting variables 7554 1726853165.85806: in VariableManager get_vars() 7554 1726853165.85951: Calling all_inventory to load vars for managed_node3 7554 1726853165.85954: Calling groups_inventory to load vars for managed_node3 7554 1726853165.85956: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853165.85966: Calling all_plugins_play to load vars for managed_node3 7554 1726853165.85968: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853165.85974: Calling groups_plugins_play to load vars for managed_node3 7554 1726853165.87209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853165.88698: done with get_vars() 7554 1726853165.88728: done getting variables 7554 1726853165.88797: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853165.88916: variable 'profile' from source: include params 7554 1726853165.88921: variable 'interface' from source: play vars 7554 1726853165.88984: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:26:05 -0400 (0:00:00.046) 0:00:19.857 ****** 7554 1726853165.89017: entering _queue_task() for managed_node3/set_fact 7554 1726853165.89352: worker is 1 (out of 1 available) 7554 1726853165.89364: exiting _queue_task() for managed_node3/set_fact 7554 1726853165.89580: done queuing things up, now waiting for results queue to drain 7554 1726853165.89582: waiting for pending results... 
7554 1726853165.89697: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 7554 1726853165.89809: in run() - task 02083763-bbaf-bdc3-98b6-000000000cf8 7554 1726853165.89878: variable 'ansible_search_path' from source: unknown 7554 1726853165.89883: variable 'ansible_search_path' from source: unknown 7554 1726853165.89889: calling self._execute() 7554 1726853165.89986: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.90003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.90016: variable 'omit' from source: magic vars 7554 1726853165.90392: variable 'ansible_distribution_major_version' from source: facts 7554 1726853165.90413: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853165.90551: variable 'profile_stat' from source: set_fact 7554 1726853165.90653: Evaluated conditional (profile_stat.stat.exists): False 7554 1726853165.90657: when evaluation is False, skipping this task 7554 1726853165.90660: _execute() done 7554 1726853165.90662: dumping result to json 7554 1726853165.90665: done dumping result, returning 7554 1726853165.90667: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [02083763-bbaf-bdc3-98b6-000000000cf8] 7554 1726853165.90669: sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf8 7554 1726853165.90743: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000cf8 7554 1726853165.90747: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7554 1726853165.90808: no more pending results, returning what we have 7554 1726853165.90812: results queue empty 7554 1726853165.90812: checking for any_errors_fatal 7554 1726853165.90818: done checking for any_errors_fatal 7554 1726853165.90819: checking for max_fail_percentage 7554 
1726853165.90820: done checking for max_fail_percentage 7554 1726853165.90822: checking to see if all hosts have failed and the running result is not ok 7554 1726853165.90823: done checking to see if all hosts have failed 7554 1726853165.90824: getting the remaining hosts for this loop 7554 1726853165.90825: done getting the remaining hosts for this loop 7554 1726853165.90829: getting the next task for host managed_node3 7554 1726853165.90837: done getting next task for host managed_node3 7554 1726853165.90840: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 7554 1726853165.90843: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853165.90849: getting variables 7554 1726853165.90851: in VariableManager get_vars() 7554 1726853165.90908: Calling all_inventory to load vars for managed_node3 7554 1726853165.90911: Calling groups_inventory to load vars for managed_node3 7554 1726853165.90914: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853165.90928: Calling all_plugins_play to load vars for managed_node3 7554 1726853165.90931: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853165.90935: Calling groups_plugins_play to load vars for managed_node3 7554 1726853165.92187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853165.93015: done with get_vars() 7554 1726853165.93031: done getting variables 7554 1726853165.93075: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853165.93150: variable 'profile' from source: include params 7554 1726853165.93154: variable 'interface' from source: play vars 7554 1726853165.93195: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:26:05 -0400 (0:00:00.041) 0:00:19.899 ****** 7554 1726853165.93217: entering _queue_task() for managed_node3/assert 7554 1726853165.93426: worker is 1 (out of 1 available) 7554 1726853165.93440: exiting _queue_task() for managed_node3/assert 7554 1726853165.93450: done queuing things up, now waiting for results queue to drain 7554 1726853165.93452: waiting for pending results... 
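The assert task just queued checks the `lsr_net_profile_exists` fact set earlier from the nmcli result. A minimal sketch of that evaluation, using the flag value recorded in the log:

```shell
# Flag value as set by the earlier set_fact task in this log.
lsr_net_profile_exists=true

# Mirror of the assert: Evaluated conditional (lsr_net_profile_exists)
if [ "$lsr_net_profile_exists" = true ]; then
    result="All assertions passed"
else
    result="Assertion failed"
fi
echo "$result"
```

Because the flag is true, the conditional evaluates true and the handler reports "All assertions passed", as the task result shows.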
7554 1726853165.93630: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' 7554 1726853165.93708: in run() - task 02083763-bbaf-bdc3-98b6-000000000adf 7554 1726853165.93717: variable 'ansible_search_path' from source: unknown 7554 1726853165.93721: variable 'ansible_search_path' from source: unknown 7554 1726853165.93750: calling self._execute() 7554 1726853165.93828: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.93834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.93842: variable 'omit' from source: magic vars 7554 1726853165.94101: variable 'ansible_distribution_major_version' from source: facts 7554 1726853165.94112: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853165.94117: variable 'omit' from source: magic vars 7554 1726853165.94151: variable 'omit' from source: magic vars 7554 1726853165.94221: variable 'profile' from source: include params 7554 1726853165.94224: variable 'interface' from source: play vars 7554 1726853165.94273: variable 'interface' from source: play vars 7554 1726853165.94288: variable 'omit' from source: magic vars 7554 1726853165.94321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853165.94352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853165.94368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853165.94383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853165.94393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853165.94417: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 
1726853165.94420: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.94422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.94496: Set connection var ansible_shell_executable to /bin/sh 7554 1726853165.94503: Set connection var ansible_pipelining to False 7554 1726853165.94506: Set connection var ansible_shell_type to sh 7554 1726853165.94508: Set connection var ansible_connection to ssh 7554 1726853165.94515: Set connection var ansible_timeout to 10 7554 1726853165.94520: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853165.94536: variable 'ansible_shell_executable' from source: unknown 7554 1726853165.94538: variable 'ansible_connection' from source: unknown 7554 1726853165.94542: variable 'ansible_module_compression' from source: unknown 7554 1726853165.94544: variable 'ansible_shell_type' from source: unknown 7554 1726853165.94549: variable 'ansible_shell_executable' from source: unknown 7554 1726853165.94552: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.94554: variable 'ansible_pipelining' from source: unknown 7554 1726853165.94557: variable 'ansible_timeout' from source: unknown 7554 1726853165.94562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.94676: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853165.94681: variable 'omit' from source: magic vars 7554 1726853165.94684: starting attempt loop 7554 1726853165.94686: running the handler 7554 1726853165.94751: variable 'lsr_net_profile_exists' from source: set_fact 7554 1726853165.94755: Evaluated conditional (lsr_net_profile_exists): True 7554 1726853165.94760: 
handler run complete 7554 1726853165.94774: attempt loop complete, returning result 7554 1726853165.94776: _execute() done 7554 1726853165.94779: dumping result to json 7554 1726853165.94782: done dumping result, returning 7554 1726853165.94787: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' [02083763-bbaf-bdc3-98b6-000000000adf] 7554 1726853165.94793: sending task result for task 02083763-bbaf-bdc3-98b6-000000000adf 7554 1726853165.94866: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000adf 7554 1726853165.94868: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853165.94919: no more pending results, returning what we have 7554 1726853165.94922: results queue empty 7554 1726853165.94922: checking for any_errors_fatal 7554 1726853165.94929: done checking for any_errors_fatal 7554 1726853165.94929: checking for max_fail_percentage 7554 1726853165.94931: done checking for max_fail_percentage 7554 1726853165.94932: checking to see if all hosts have failed and the running result is not ok 7554 1726853165.94933: done checking to see if all hosts have failed 7554 1726853165.94933: getting the remaining hosts for this loop 7554 1726853165.94935: done getting the remaining hosts for this loop 7554 1726853165.94937: getting the next task for host managed_node3 7554 1726853165.94942: done getting next task for host managed_node3 7554 1726853165.94944: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 7554 1726853165.94947: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853165.94950: getting variables 7554 1726853165.94952: in VariableManager get_vars() 7554 1726853165.94994: Calling all_inventory to load vars for managed_node3 7554 1726853165.94997: Calling groups_inventory to load vars for managed_node3 7554 1726853165.94999: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853165.95008: Calling all_plugins_play to load vars for managed_node3 7554 1726853165.95011: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853165.95013: Calling groups_plugins_play to load vars for managed_node3 7554 1726853165.95758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853165.96599: done with get_vars() 7554 1726853165.96614: done getting variables 7554 1726853165.96654: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853165.96728: variable 'profile' from source: include params 7554 1726853165.96731: variable 'interface' from source: play vars 7554 1726853165.96767: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:26:05 -0400 (0:00:00.035) 0:00:19.935 ****** 7554 1726853165.96794: entering _queue_task() for managed_node3/assert 7554 1726853165.96987: worker is 1 (out of 1 available) 7554 1726853165.97003: exiting _queue_task() for 
managed_node3/assert 7554 1726853165.97014: done queuing things up, now waiting for results queue to drain 7554 1726853165.97015: waiting for pending results... 7554 1726853165.97188: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' 7554 1726853165.97257: in run() - task 02083763-bbaf-bdc3-98b6-000000000ae0 7554 1726853165.97269: variable 'ansible_search_path' from source: unknown 7554 1726853165.97273: variable 'ansible_search_path' from source: unknown 7554 1726853165.97299: calling self._execute() 7554 1726853165.97377: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.97383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.97391: variable 'omit' from source: magic vars 7554 1726853165.97642: variable 'ansible_distribution_major_version' from source: facts 7554 1726853165.97653: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853165.97659: variable 'omit' from source: magic vars 7554 1726853165.97691: variable 'omit' from source: magic vars 7554 1726853165.97758: variable 'profile' from source: include params 7554 1726853165.97762: variable 'interface' from source: play vars 7554 1726853165.97809: variable 'interface' from source: play vars 7554 1726853165.97824: variable 'omit' from source: magic vars 7554 1726853165.97857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853165.97885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853165.97904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853165.97916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853165.97926: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853165.97952: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853165.97955: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.97957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.98025: Set connection var ansible_shell_executable to /bin/sh 7554 1726853165.98032: Set connection var ansible_pipelining to False 7554 1726853165.98035: Set connection var ansible_shell_type to sh 7554 1726853165.98037: Set connection var ansible_connection to ssh 7554 1726853165.98047: Set connection var ansible_timeout to 10 7554 1726853165.98052: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853165.98068: variable 'ansible_shell_executable' from source: unknown 7554 1726853165.98072: variable 'ansible_connection' from source: unknown 7554 1726853165.98075: variable 'ansible_module_compression' from source: unknown 7554 1726853165.98077: variable 'ansible_shell_type' from source: unknown 7554 1726853165.98079: variable 'ansible_shell_executable' from source: unknown 7554 1726853165.98081: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853165.98086: variable 'ansible_pipelining' from source: unknown 7554 1726853165.98088: variable 'ansible_timeout' from source: unknown 7554 1726853165.98092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853165.98190: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853165.98198: variable 'omit' from source: magic vars 7554 1726853165.98203: starting attempt loop 7554 
1726853165.98206: running the handler 7554 1726853165.98281: variable 'lsr_net_profile_ansible_managed' from source: set_fact 7554 1726853165.98285: Evaluated conditional (lsr_net_profile_ansible_managed): True 7554 1726853165.98291: handler run complete 7554 1726853165.98301: attempt loop complete, returning result 7554 1726853165.98304: _execute() done 7554 1726853165.98306: dumping result to json 7554 1726853165.98309: done dumping result, returning 7554 1726853165.98314: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' [02083763-bbaf-bdc3-98b6-000000000ae0] 7554 1726853165.98321: sending task result for task 02083763-bbaf-bdc3-98b6-000000000ae0 7554 1726853165.98394: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000ae0 7554 1726853165.98397: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853165.98479: no more pending results, returning what we have 7554 1726853165.98482: results queue empty 7554 1726853165.98483: checking for any_errors_fatal 7554 1726853165.98486: done checking for any_errors_fatal 7554 1726853165.98487: checking for max_fail_percentage 7554 1726853165.98488: done checking for max_fail_percentage 7554 1726853165.98489: checking to see if all hosts have failed and the running result is not ok 7554 1726853165.98490: done checking to see if all hosts have failed 7554 1726853165.98490: getting the remaining hosts for this loop 7554 1726853165.98491: done getting the remaining hosts for this loop 7554 1726853165.98494: getting the next task for host managed_node3 7554 1726853165.98500: done getting next task for host managed_node3 7554 1726853165.98502: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 7554 1726853165.98504: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853165.98507: getting variables 7554 1726853165.98508: in VariableManager get_vars() 7554 1726853165.98545: Calling all_inventory to load vars for managed_node3 7554 1726853165.98548: Calling groups_inventory to load vars for managed_node3 7554 1726853165.98550: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853165.98558: Calling all_plugins_play to load vars for managed_node3 7554 1726853165.98560: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853165.98563: Calling groups_plugins_play to load vars for managed_node3 7554 1726853166.02524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853166.03350: done with get_vars() 7554 1726853166.03366: done getting variables 7554 1726853166.03402: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853166.03469: variable 'profile' from source: include params 7554 1726853166.03473: variable 'interface' from source: play vars 7554 1726853166.03512: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 
2024 13:26:06 -0400 (0:00:00.067) 0:00:20.002 ****** 7554 1726853166.03534: entering _queue_task() for managed_node3/assert 7554 1726853166.03784: worker is 1 (out of 1 available) 7554 1726853166.03800: exiting _queue_task() for managed_node3/assert 7554 1726853166.03811: done queuing things up, now waiting for results queue to drain 7554 1726853166.03814: waiting for pending results... 7554 1726853166.03999: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 7554 1726853166.04082: in run() - task 02083763-bbaf-bdc3-98b6-000000000ae1 7554 1726853166.04094: variable 'ansible_search_path' from source: unknown 7554 1726853166.04098: variable 'ansible_search_path' from source: unknown 7554 1726853166.04126: calling self._execute() 7554 1726853166.04205: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.04210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.04220: variable 'omit' from source: magic vars 7554 1726853166.04486: variable 'ansible_distribution_major_version' from source: facts 7554 1726853166.04497: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853166.04502: variable 'omit' from source: magic vars 7554 1726853166.04526: variable 'omit' from source: magic vars 7554 1726853166.04597: variable 'profile' from source: include params 7554 1726853166.04601: variable 'interface' from source: play vars 7554 1726853166.04645: variable 'interface' from source: play vars 7554 1726853166.04662: variable 'omit' from source: magic vars 7554 1726853166.04700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853166.04724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853166.04741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853166.04756: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.04768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.04792: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853166.04795: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.04799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.04867: Set connection var ansible_shell_executable to /bin/sh 7554 1726853166.04874: Set connection var ansible_pipelining to False 7554 1726853166.04877: Set connection var ansible_shell_type to sh 7554 1726853166.04880: Set connection var ansible_connection to ssh 7554 1726853166.04888: Set connection var ansible_timeout to 10 7554 1726853166.04891: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853166.04909: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.04913: variable 'ansible_connection' from source: unknown 7554 1726853166.04916: variable 'ansible_module_compression' from source: unknown 7554 1726853166.04919: variable 'ansible_shell_type' from source: unknown 7554 1726853166.04922: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.04924: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.04927: variable 'ansible_pipelining' from source: unknown 7554 1726853166.04929: variable 'ansible_timeout' from source: unknown 7554 1726853166.04932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.05030: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853166.05038: variable 'omit' from source: magic vars 7554 1726853166.05051: starting attempt loop 7554 1726853166.05054: running the handler 7554 1726853166.05121: variable 'lsr_net_profile_fingerprint' from source: set_fact 7554 1726853166.05124: Evaluated conditional (lsr_net_profile_fingerprint): True 7554 1726853166.05130: handler run complete 7554 1726853166.05140: attempt loop complete, returning result 7554 1726853166.05143: _execute() done 7554 1726853166.05147: dumping result to json 7554 1726853166.05152: done dumping result, returning 7554 1726853166.05163: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 [02083763-bbaf-bdc3-98b6-000000000ae1] 7554 1726853166.05165: sending task result for task 02083763-bbaf-bdc3-98b6-000000000ae1 7554 1726853166.05239: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000ae1 7554 1726853166.05241: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853166.05303: no more pending results, returning what we have 7554 1726853166.05306: results queue empty 7554 1726853166.05307: checking for any_errors_fatal 7554 1726853166.05314: done checking for any_errors_fatal 7554 1726853166.05315: checking for max_fail_percentage 7554 1726853166.05316: done checking for max_fail_percentage 7554 1726853166.05317: checking to see if all hosts have failed and the running result is not ok 7554 1726853166.05318: done checking to see if all hosts have failed 7554 1726853166.05319: getting the remaining hosts for this loop 7554 1726853166.05320: done getting the remaining hosts for this loop 7554 1726853166.05323: getting the next task for host managed_node3 7554 1726853166.05330: done getting next task for host managed_node3 7554 
1726853166.05332: ^ task is: TASK: Show ipv4 routes 7554 1726853166.05334: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853166.05338: getting variables 7554 1726853166.05340: in VariableManager get_vars() 7554 1726853166.05382: Calling all_inventory to load vars for managed_node3 7554 1726853166.05384: Calling groups_inventory to load vars for managed_node3 7554 1726853166.05386: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853166.05396: Calling all_plugins_play to load vars for managed_node3 7554 1726853166.05398: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853166.05401: Calling groups_plugins_play to load vars for managed_node3 7554 1726853166.06143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853166.07002: done with get_vars() 7554 1726853166.07016: done getting variables 7554 1726853166.07054: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv4 routes] ******************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:48 Friday 20 September 2024 13:26:06 -0400 (0:00:00.035) 0:00:20.038 ****** 7554 1726853166.07075: entering _queue_task() for managed_node3/command 7554 1726853166.07265: worker is 1 (out of 1 available) 7554 1726853166.07280: exiting _queue_task() for managed_node3/command 7554 
1726853166.07291: done queuing things up, now waiting for results queue to drain 7554 1726853166.07293: waiting for pending results... 7554 1726853166.07458: running TaskExecutor() for managed_node3/TASK: Show ipv4 routes 7554 1726853166.07519: in run() - task 02083763-bbaf-bdc3-98b6-00000000005d 7554 1726853166.07529: variable 'ansible_search_path' from source: unknown 7554 1726853166.07562: calling self._execute() 7554 1726853166.07636: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.07644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.07655: variable 'omit' from source: magic vars 7554 1726853166.07912: variable 'ansible_distribution_major_version' from source: facts 7554 1726853166.07922: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853166.07927: variable 'omit' from source: magic vars 7554 1726853166.07942: variable 'omit' from source: magic vars 7554 1726853166.07971: variable 'omit' from source: magic vars 7554 1726853166.08007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853166.08035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853166.08056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853166.08069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.08080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.08105: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853166.08108: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.08110: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853166.08178: Set connection var ansible_shell_executable to /bin/sh 7554 1726853166.08186: Set connection var ansible_pipelining to False 7554 1726853166.08189: Set connection var ansible_shell_type to sh 7554 1726853166.08193: Set connection var ansible_connection to ssh 7554 1726853166.08201: Set connection var ansible_timeout to 10 7554 1726853166.08204: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853166.08221: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.08224: variable 'ansible_connection' from source: unknown 7554 1726853166.08228: variable 'ansible_module_compression' from source: unknown 7554 1726853166.08230: variable 'ansible_shell_type' from source: unknown 7554 1726853166.08233: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.08235: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.08237: variable 'ansible_pipelining' from source: unknown 7554 1726853166.08239: variable 'ansible_timeout' from source: unknown 7554 1726853166.08245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.08343: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853166.08353: variable 'omit' from source: magic vars 7554 1726853166.08358: starting attempt loop 7554 1726853166.08362: running the handler 7554 1726853166.08374: _low_level_execute_command(): starting 7554 1726853166.08383: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853166.08893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7554 1726853166.08897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853166.08901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.08953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853166.08956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.08960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.09025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.10729: stdout chunk (state=3): >>>/root <<< 7554 1726853166.10820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853166.10850: stderr chunk (state=3): >>><<< 7554 1726853166.10854: stdout chunk (state=3): >>><<< 7554 1726853166.10876: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853166.10888: _low_level_execute_command(): starting 7554 1726853166.10893: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689 `" && echo ansible-tmp-1726853166.1087701-8345-21843946209689="` echo /root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689 `" ) && sleep 0' 7554 1726853166.11331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853166.11334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853166.11337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 
7554 1726853166.11346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853166.11348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.11396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.11403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.11462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.13437: stdout chunk (state=3): >>>ansible-tmp-1726853166.1087701-8345-21843946209689=/root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689 <<< 7554 1726853166.13550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853166.13574: stderr chunk (state=3): >>><<< 7554 1726853166.13577: stdout chunk (state=3): >>><<< 7554 1726853166.13591: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853166.1087701-8345-21843946209689=/root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853166.13617: variable 'ansible_module_compression' from source: unknown 7554 1726853166.13660: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853166.13691: variable 'ansible_facts' from source: unknown 7554 1726853166.13751: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689/AnsiballZ_command.py 7554 1726853166.13847: Sending initial data 7554 1726853166.13850: Sent initial data (153 bytes) 7554 1726853166.14284: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853166.14290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853166.14292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853166.14295: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 7554 1726853166.14297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.14341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853166.14345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.14417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.16053: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853166.16057: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853166.16109: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853166.16167: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpk6o4ig2j /root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689/AnsiballZ_command.py <<< 7554 1726853166.16172: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689/AnsiballZ_command.py" <<< 7554 1726853166.16223: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpk6o4ig2j" to remote "/root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689/AnsiballZ_command.py" <<< 7554 1726853166.16819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853166.16854: stderr chunk (state=3): >>><<< 7554 1726853166.16857: stdout chunk (state=3): >>><<< 7554 1726853166.16874: done transferring module to remote 7554 1726853166.16884: _low_level_execute_command(): starting 7554 1726853166.16887: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689/ /root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689/AnsiballZ_command.py && sleep 0' 7554 1726853166.17301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853166.17305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853166.17307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass <<< 7554 1726853166.17309: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853166.17311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.17364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853166.17368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.17373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.17432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.19286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853166.19307: stderr chunk (state=3): >>><<< 7554 1726853166.19310: stdout chunk (state=3): >>><<< 7554 1726853166.19321: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853166.19324: _low_level_execute_command(): starting 7554 1726853166.19329: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689/AnsiballZ_command.py && sleep 0' 7554 1726853166.19733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853166.19736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853166.19739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853166.19743: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853166.19746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.19794: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.19802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.19862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.36098: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \ndefault via 203.0.113.1 dev veth0 proto static metric 65535 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 13:26:06.355092", "end": "2024-09-20 13:26:06.359141", "delta": "0:00:00.004049", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853166.37744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853166.37765: stderr chunk (state=3): >>><<< 7554 1726853166.37768: stdout chunk (state=3): >>><<< 7554 1726853166.37789: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \ndefault via 203.0.113.1 dev veth0 proto static metric 65535 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 13:26:06.355092", "end": "2024-09-20 13:26:06.359141", "delta": "0:00:00.004049", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853166.37820: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853166.37827: _low_level_execute_command(): starting 7554 1726853166.37832: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853166.1087701-8345-21843946209689/ > /dev/null 2>&1 && sleep 0' 7554 1726853166.38274: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853166.38277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.38285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853166.38287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.38330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853166.38337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.38339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.38398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.40324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853166.40328: stdout chunk (state=3): >>><<< 7554 1726853166.40330: stderr chunk (state=3): >>><<< 7554 1726853166.40477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853166.40480: handler run complete 7554 1726853166.40482: Evaluated conditional (False): False 7554 1726853166.40484: attempt loop complete, returning result 7554 1726853166.40486: _execute() done 7554 1726853166.40488: dumping result to json 7554 1726853166.40490: done dumping result, returning 7554 1726853166.40492: done running TaskExecutor() for managed_node3/TASK: Show ipv4 routes [02083763-bbaf-bdc3-98b6-00000000005d] 7554 1726853166.40494: sending task result for task 02083763-bbaf-bdc3-98b6-00000000005d 7554 1726853166.40565: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000005d 7554 1726853166.40568: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "route" ], "delta": "0:00:00.004049", "end": "2024-09-20 13:26:06.359141", "rc": 0, "start": "2024-09-20 13:26:06.355092" } STDOUT: default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 default via 203.0.113.1 dev veth0 proto static metric 65535 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 7554 1726853166.40652: no more pending results, returning what we have 7554 1726853166.40655: results queue empty 7554 1726853166.40656: checking for any_errors_fatal 7554 1726853166.40661: done checking for any_errors_fatal 7554 1726853166.40662: checking for max_fail_percentage 7554 1726853166.40664: done checking for max_fail_percentage 7554 1726853166.40665: checking to see if all hosts have failed and the running result is not ok 7554 1726853166.40666: done checking to see if all hosts have failed 7554 1726853166.40667: getting the remaining hosts for this loop 7554 1726853166.40668: done getting the remaining hosts for 
this loop 7554 1726853166.40673: getting the next task for host managed_node3 7554 1726853166.40876: done getting next task for host managed_node3 7554 1726853166.40879: ^ task is: TASK: Assert default ipv4 route is present 7554 1726853166.40882: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853166.40886: getting variables 7554 1726853166.40888: in VariableManager get_vars() 7554 1726853166.40934: Calling all_inventory to load vars for managed_node3 7554 1726853166.40937: Calling groups_inventory to load vars for managed_node3 7554 1726853166.40939: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853166.40952: Calling all_plugins_play to load vars for managed_node3 7554 1726853166.40956: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853166.40959: Calling groups_plugins_play to load vars for managed_node3 7554 1726853166.42656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853166.44255: done with get_vars() 7554 1726853166.44283: done getting variables 7554 1726853166.44354: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv4 route is present] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:52 Friday 20 September 2024 13:26:06 -0400 (0:00:00.373) 0:00:20.411 ****** 7554 1726853166.44386: entering _queue_task() 
for managed_node3/assert 7554 1726853166.44745: worker is 1 (out of 1 available) 7554 1726853166.44758: exiting _queue_task() for managed_node3/assert 7554 1726853166.44772: done queuing things up, now waiting for results queue to drain 7554 1726853166.44879: waiting for pending results... 7554 1726853166.45191: running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is present 7554 1726853166.45197: in run() - task 02083763-bbaf-bdc3-98b6-00000000005e 7554 1726853166.45224: variable 'ansible_search_path' from source: unknown 7554 1726853166.45269: calling self._execute() 7554 1726853166.45383: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.45400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.45413: variable 'omit' from source: magic vars 7554 1726853166.45825: variable 'ansible_distribution_major_version' from source: facts 7554 1726853166.45862: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853166.45865: variable 'omit' from source: magic vars 7554 1726853166.45889: variable 'omit' from source: magic vars 7554 1726853166.45928: variable 'omit' from source: magic vars 7554 1726853166.45985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853166.46028: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853166.46080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853166.46085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.46103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.46176: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 
1726853166.46184: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.46190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.46273: Set connection var ansible_shell_executable to /bin/sh 7554 1726853166.46293: Set connection var ansible_pipelining to False 7554 1726853166.46304: Set connection var ansible_shell_type to sh 7554 1726853166.46312: Set connection var ansible_connection to ssh 7554 1726853166.46327: Set connection var ansible_timeout to 10 7554 1726853166.46406: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853166.46410: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.46412: variable 'ansible_connection' from source: unknown 7554 1726853166.46415: variable 'ansible_module_compression' from source: unknown 7554 1726853166.46417: variable 'ansible_shell_type' from source: unknown 7554 1726853166.46419: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.46421: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.46423: variable 'ansible_pipelining' from source: unknown 7554 1726853166.46425: variable 'ansible_timeout' from source: unknown 7554 1726853166.46427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.46565: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853166.46583: variable 'omit' from source: magic vars 7554 1726853166.46593: starting attempt loop 7554 1726853166.46600: running the handler 7554 1726853166.46764: variable '__test_str' from source: task vars 7554 1726853166.46840: variable 'interface' from source: play vars 7554 1726853166.46862: variable 
'ipv4_routes' from source: set_fact 7554 1726853166.46950: Evaluated conditional (__test_str in ipv4_routes.stdout): True 7554 1726853166.46955: handler run complete 7554 1726853166.46957: attempt loop complete, returning result 7554 1726853166.46960: _execute() done 7554 1726853166.46962: dumping result to json 7554 1726853166.46964: done dumping result, returning 7554 1726853166.46966: done running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is present [02083763-bbaf-bdc3-98b6-00000000005e] 7554 1726853166.46968: sending task result for task 02083763-bbaf-bdc3-98b6-00000000005e 7554 1726853166.47126: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000005e 7554 1726853166.47129: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853166.47184: no more pending results, returning what we have 7554 1726853166.47188: results queue empty 7554 1726853166.47189: checking for any_errors_fatal 7554 1726853166.47197: done checking for any_errors_fatal 7554 1726853166.47198: checking for max_fail_percentage 7554 1726853166.47200: done checking for max_fail_percentage 7554 1726853166.47201: checking to see if all hosts have failed and the running result is not ok 7554 1726853166.47202: done checking to see if all hosts have failed 7554 1726853166.47203: getting the remaining hosts for this loop 7554 1726853166.47205: done getting the remaining hosts for this loop 7554 1726853166.47208: getting the next task for host managed_node3 7554 1726853166.47215: done getting next task for host managed_node3 7554 1726853166.47217: ^ task is: TASK: Get ipv6 routes 7554 1726853166.47219: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853166.47224: getting variables 7554 1726853166.47226: in VariableManager get_vars() 7554 1726853166.47284: Calling all_inventory to load vars for managed_node3 7554 1726853166.47287: Calling groups_inventory to load vars for managed_node3 7554 1726853166.47290: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853166.47302: Calling all_plugins_play to load vars for managed_node3 7554 1726853166.47305: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853166.47308: Calling groups_plugins_play to load vars for managed_node3 7554 1726853166.48601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853166.49561: done with get_vars() 7554 1726853166.49577: done getting variables 7554 1726853166.49619: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:57 Friday 20 September 2024 13:26:06 -0400 (0:00:00.052) 0:00:20.463 ****** 7554 1726853166.49639: entering _queue_task() for managed_node3/command 7554 1726853166.49867: worker is 1 (out of 1 available) 7554 1726853166.49883: exiting _queue_task() for managed_node3/command 7554 1726853166.49896: done queuing things up, now waiting for results queue to drain 7554 1726853166.49897: waiting for pending results... 
7554 1726853166.50287: running TaskExecutor() for managed_node3/TASK: Get ipv6 routes 7554 1726853166.50292: in run() - task 02083763-bbaf-bdc3-98b6-00000000005f 7554 1726853166.50296: variable 'ansible_search_path' from source: unknown 7554 1726853166.50300: calling self._execute() 7554 1726853166.50368: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.50394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.50410: variable 'omit' from source: magic vars 7554 1726853166.50800: variable 'ansible_distribution_major_version' from source: facts 7554 1726853166.50821: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853166.50890: variable 'omit' from source: magic vars 7554 1726853166.50893: variable 'omit' from source: magic vars 7554 1726853166.50952: variable 'omit' from source: magic vars 7554 1726853166.51002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853166.51052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853166.51087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853166.51100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.51110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.51134: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853166.51147: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.51169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.51237: Set connection var ansible_shell_executable to /bin/sh 7554 1726853166.51246: Set connection var 
ansible_pipelining to False 7554 1726853166.51249: Set connection var ansible_shell_type to sh 7554 1726853166.51251: Set connection var ansible_connection to ssh 7554 1726853166.51259: Set connection var ansible_timeout to 10 7554 1726853166.51264: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853166.51285: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.51288: variable 'ansible_connection' from source: unknown 7554 1726853166.51291: variable 'ansible_module_compression' from source: unknown 7554 1726853166.51294: variable 'ansible_shell_type' from source: unknown 7554 1726853166.51296: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.51298: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.51300: variable 'ansible_pipelining' from source: unknown 7554 1726853166.51303: variable 'ansible_timeout' from source: unknown 7554 1726853166.51308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.51414: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853166.51424: variable 'omit' from source: magic vars 7554 1726853166.51428: starting attempt loop 7554 1726853166.51430: running the handler 7554 1726853166.51447: _low_level_execute_command(): starting 7554 1726853166.51453: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853166.51951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853166.51956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853166.51961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.52013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853166.52020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.52022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.52089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.53803: stdout chunk (state=3): >>>/root <<< 7554 1726853166.53965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853166.53968: stdout chunk (state=3): >>><<< 7554 1726853166.53973: stderr chunk (state=3): >>><<< 7554 1726853166.53997: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853166.54096: _low_level_execute_command(): starting 7554 1726853166.54099: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016 `" && echo ansible-tmp-1726853166.5400302-8357-84116509035016="` echo /root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016 `" ) && sleep 0' 7554 1726853166.54675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853166.54690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853166.54704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853166.54789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.54852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853166.54869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.54891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.54985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.56958: stdout chunk (state=3): >>>ansible-tmp-1726853166.5400302-8357-84116509035016=/root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016 <<< 7554 1726853166.57127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853166.57131: stdout chunk (state=3): >>><<< 7554 1726853166.57134: stderr chunk (state=3): >>><<< 7554 1726853166.57277: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853166.5400302-8357-84116509035016=/root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853166.57280: variable 'ansible_module_compression' from source: unknown 7554 1726853166.57283: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853166.57303: variable 'ansible_facts' from source: unknown 7554 1726853166.57406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016/AnsiballZ_command.py 7554 1726853166.57638: Sending initial data 7554 1726853166.57644: Sent initial data (153 bytes) 7554 1726853166.58236: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853166.58291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853166.58313: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7554 1726853166.58389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.58421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853166.58437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.58461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.58563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.60247: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853166.60324: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853166.60400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp8zajvdfj /root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016/AnsiballZ_command.py <<< 7554 1726853166.60403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016/AnsiballZ_command.py" <<< 7554 1726853166.60467: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp8zajvdfj" to remote "/root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016/AnsiballZ_command.py" <<< 7554 1726853166.61326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853166.61329: stdout chunk (state=3): >>><<< 7554 1726853166.61337: stderr chunk (state=3): >>><<< 7554 1726853166.61376: done transferring module to remote 7554 1726853166.61379: _low_level_execute_command(): starting 7554 1726853166.61390: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016/ /root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016/AnsiballZ_command.py && sleep 0' 7554 1726853166.62025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853166.62040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853166.62056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853166.62077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853166.62185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.62217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.62317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.64266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853166.64273: stdout chunk (state=3): >>><<< 7554 1726853166.64275: stderr chunk (state=3): >>><<< 7554 1726853166.64374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853166.64378: _low_level_execute_command(): starting 7554 1726853166.64381: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016/AnsiballZ_command.py && sleep 0' 7554 1726853166.64960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853166.64976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853166.65053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.65102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853166.65116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.65148: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 7554 1726853166.65250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.81376: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 13:26:06.808283", "end": "2024-09-20 13:26:06.812145", "delta": "0:00:00.003862", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853166.83036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853166.83065: stderr chunk (state=3): >>><<< 7554 1726853166.83068: stdout chunk (state=3): >>><<< 7554 1726853166.83091: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 13:26:06.808283", "end": "2024-09-20 13:26:06.812145", "delta": "0:00:00.003862", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853166.83118: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853166.83126: _low_level_execute_command(): starting 7554 1726853166.83131: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853166.5400302-8357-84116509035016/ > /dev/null 2>&1 && sleep 0' 7554 1726853166.83555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853166.83595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853166.83598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853166.83600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.83602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853166.83609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853166.83611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853166.83656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853166.83659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853166.83665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853166.83723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853166.85591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853166.85613: stderr chunk (state=3): >>><<< 7554 1726853166.85616: stdout chunk (state=3): >>><<< 7554 1726853166.85628: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853166.85634: handler run complete 7554 1726853166.85655: Evaluated conditional (False): False 7554 1726853166.85665: attempt loop complete, returning result 7554 1726853166.85668: _execute() done 7554 1726853166.85673: dumping result to json 7554 1726853166.85677: done dumping result, returning 7554 1726853166.85684: done running TaskExecutor() for managed_node3/TASK: Get ipv6 routes [02083763-bbaf-bdc3-98b6-00000000005f] 7554 1726853166.85689: sending task result for task 02083763-bbaf-bdc3-98b6-00000000005f 7554 1726853166.85783: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000005f 7554 1726853166.85786: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003862", "end": "2024-09-20 13:26:06.812145", "rc": 0, "start": "2024-09-20 13:26:06.808283" } STDOUT: 2001:db8::/64 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev peerveth0 proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 7554 1726853166.85857: no more pending results, returning what we have 7554 1726853166.85860: results queue empty 7554 1726853166.85861: checking for any_errors_fatal 7554 1726853166.85867: done checking for any_errors_fatal 7554 1726853166.85867: checking for max_fail_percentage 7554 1726853166.85869: done checking for max_fail_percentage 7554 1726853166.85870: checking to see if all hosts have failed and the running result 
is not ok 7554 1726853166.85877: done checking to see if all hosts have failed 7554 1726853166.85877: getting the remaining hosts for this loop 7554 1726853166.85879: done getting the remaining hosts for this loop 7554 1726853166.85882: getting the next task for host managed_node3 7554 1726853166.85887: done getting next task for host managed_node3 7554 1726853166.85890: ^ task is: TASK: Assert default ipv6 route is present 7554 1726853166.85892: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853166.85900: getting variables 7554 1726853166.85902: in VariableManager get_vars() 7554 1726853166.85948: Calling all_inventory to load vars for managed_node3 7554 1726853166.85950: Calling groups_inventory to load vars for managed_node3 7554 1726853166.85952: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853166.85962: Calling all_plugins_play to load vars for managed_node3 7554 1726853166.85964: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853166.85967: Calling groups_plugins_play to load vars for managed_node3 7554 1726853166.86759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853166.87609: done with get_vars() 7554 1726853166.87624: done getting variables 7554 1726853166.87667: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is present] ************************************ task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:61 Friday 20 September 2024 13:26:06 -0400 (0:00:00.380) 0:00:20.844 ****** 7554 1726853166.87690: entering _queue_task() for managed_node3/assert 7554 1726853166.87904: worker is 1 (out of 1 available) 7554 1726853166.87920: exiting _queue_task() for managed_node3/assert 7554 1726853166.87931: done queuing things up, now waiting for results queue to drain 7554 1726853166.87933: waiting for pending results... 7554 1726853166.88114: running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is present 7554 1726853166.88181: in run() - task 02083763-bbaf-bdc3-98b6-000000000060 7554 1726853166.88194: variable 'ansible_search_path' from source: unknown 7554 1726853166.88223: calling self._execute() 7554 1726853166.88303: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.88306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.88315: variable 'omit' from source: magic vars 7554 1726853166.88586: variable 'ansible_distribution_major_version' from source: facts 7554 1726853166.88600: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853166.88677: variable 'network_provider' from source: set_fact 7554 1726853166.88681: Evaluated conditional (network_provider == "nm"): True 7554 1726853166.88688: variable 'omit' from source: magic vars 7554 1726853166.88708: variable 'omit' from source: magic vars 7554 1726853166.88731: variable 'omit' from source: magic vars 7554 1726853166.88765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853166.88794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853166.88817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853166.88827: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.88835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.88861: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853166.88864: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.88867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.88937: Set connection var ansible_shell_executable to /bin/sh 7554 1726853166.88945: Set connection var ansible_pipelining to False 7554 1726853166.88949: Set connection var ansible_shell_type to sh 7554 1726853166.88951: Set connection var ansible_connection to ssh 7554 1726853166.88960: Set connection var ansible_timeout to 10 7554 1726853166.88965: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853166.88984: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.88987: variable 'ansible_connection' from source: unknown 7554 1726853166.88989: variable 'ansible_module_compression' from source: unknown 7554 1726853166.88992: variable 'ansible_shell_type' from source: unknown 7554 1726853166.88994: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.88996: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.88999: variable 'ansible_pipelining' from source: unknown 7554 1726853166.89001: variable 'ansible_timeout' from source: unknown 7554 1726853166.89006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.89109: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853166.89118: variable 'omit' from source: magic vars 7554 1726853166.89122: starting attempt loop 7554 1726853166.89125: running the handler 7554 1726853166.89218: variable '__test_str' from source: task vars 7554 1726853166.89269: variable 'interface' from source: play vars 7554 1726853166.89279: variable 'ipv6_route' from source: set_fact 7554 1726853166.89289: Evaluated conditional (__test_str in ipv6_route.stdout): True 7554 1726853166.89294: handler run complete 7554 1726853166.89305: attempt loop complete, returning result 7554 1726853166.89307: _execute() done 7554 1726853166.89310: dumping result to json 7554 1726853166.89312: done dumping result, returning 7554 1726853166.89319: done running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is present [02083763-bbaf-bdc3-98b6-000000000060] 7554 1726853166.89324: sending task result for task 02083763-bbaf-bdc3-98b6-000000000060 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853166.89454: no more pending results, returning what we have 7554 1726853166.89457: results queue empty 7554 1726853166.89458: checking for any_errors_fatal 7554 1726853166.89467: done checking for any_errors_fatal 7554 1726853166.89467: checking for max_fail_percentage 7554 1726853166.89468: done checking for max_fail_percentage 7554 1726853166.89469: checking to see if all hosts have failed and the running result is not ok 7554 1726853166.89472: done checking to see if all hosts have failed 7554 1726853166.89473: getting the remaining hosts for this loop 7554 1726853166.89474: done getting the remaining hosts for this loop 7554 1726853166.89477: getting the next task for host managed_node3 7554 1726853166.89484: done getting next task for host managed_node3 7554 1726853166.89486: ^ task is: TASK: 
TEARDOWN: remove profiles. 7554 1726853166.89488: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853166.89491: getting variables 7554 1726853166.89492: in VariableManager get_vars() 7554 1726853166.89532: Calling all_inventory to load vars for managed_node3 7554 1726853166.89534: Calling groups_inventory to load vars for managed_node3 7554 1726853166.89536: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853166.89545: Calling all_plugins_play to load vars for managed_node3 7554 1726853166.89548: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853166.89550: Calling groups_plugins_play to load vars for managed_node3 7554 1726853166.90453: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000060 7554 1726853166.90461: WORKER PROCESS EXITING 7554 1726853166.90473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853166.91306: done with get_vars() 7554 1726853166.91323: done getting variables 7554 1726853166.91364: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] 
********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:67 Friday 20 September 2024 13:26:06 -0400 (0:00:00.036) 0:00:20.881 ****** 7554 1726853166.91385: entering _queue_task() for managed_node3/debug 7554 1726853166.91606: worker is 1 (out of 1 available) 7554 1726853166.91620: exiting _queue_task() for managed_node3/debug 7554 1726853166.91632: done queuing things up, now waiting for results queue to drain 7554 1726853166.91633: waiting for pending results... 7554 1726853166.91991: running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 7554 1726853166.92032: in run() - task 02083763-bbaf-bdc3-98b6-000000000061 7554 1726853166.92054: variable 'ansible_search_path' from source: unknown 7554 1726853166.92098: calling self._execute() 7554 1726853166.92277: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.92281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.92284: variable 'omit' from source: magic vars 7554 1726853166.92650: variable 'ansible_distribution_major_version' from source: facts 7554 1726853166.92675: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853166.92768: variable 'omit' from source: magic vars 7554 1726853166.92774: variable 'omit' from source: magic vars 7554 1726853166.92777: variable 'omit' from source: magic vars 7554 1726853166.92802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853166.92846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853166.92881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853166.92904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 7554 1726853166.93279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853166.93282: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853166.93285: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.93288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.93290: Set connection var ansible_shell_executable to /bin/sh 7554 1726853166.93293: Set connection var ansible_pipelining to False 7554 1726853166.93295: Set connection var ansible_shell_type to sh 7554 1726853166.93298: Set connection var ansible_connection to ssh 7554 1726853166.93299: Set connection var ansible_timeout to 10 7554 1726853166.93301: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853166.93397: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.93408: variable 'ansible_connection' from source: unknown 7554 1726853166.93417: variable 'ansible_module_compression' from source: unknown 7554 1726853166.93427: variable 'ansible_shell_type' from source: unknown 7554 1726853166.93436: variable 'ansible_shell_executable' from source: unknown 7554 1726853166.93444: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.93453: variable 'ansible_pipelining' from source: unknown 7554 1726853166.93461: variable 'ansible_timeout' from source: unknown 7554 1726853166.93491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.94043: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853166.94047: 
variable 'omit' from source: magic vars 7554 1726853166.94050: starting attempt loop 7554 1726853166.94052: running the handler 7554 1726853166.94055: handler run complete 7554 1726853166.94057: attempt loop complete, returning result 7554 1726853166.94058: _execute() done 7554 1726853166.94061: dumping result to json 7554 1726853166.94063: done dumping result, returning 7554 1726853166.94068: done running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. [02083763-bbaf-bdc3-98b6-000000000061] 7554 1726853166.94083: sending task result for task 02083763-bbaf-bdc3-98b6-000000000061 ok: [managed_node3] => {} MSG: ################################################## 7554 1726853166.94232: no more pending results, returning what we have 7554 1726853166.94236: results queue empty 7554 1726853166.94237: checking for any_errors_fatal 7554 1726853166.94244: done checking for any_errors_fatal 7554 1726853166.94245: checking for max_fail_percentage 7554 1726853166.94247: done checking for max_fail_percentage 7554 1726853166.94248: checking to see if all hosts have failed and the running result is not ok 7554 1726853166.94249: done checking to see if all hosts have failed 7554 1726853166.94250: getting the remaining hosts for this loop 7554 1726853166.94251: done getting the remaining hosts for this loop 7554 1726853166.94261: getting the next task for host managed_node3 7554 1726853166.94269: done getting next task for host managed_node3 7554 1726853166.94276: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7554 1726853166.94280: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853166.94302: getting variables 7554 1726853166.94303: in VariableManager get_vars() 7554 1726853166.94356: Calling all_inventory to load vars for managed_node3 7554 1726853166.94359: Calling groups_inventory to load vars for managed_node3 7554 1726853166.94362: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853166.94642: Calling all_plugins_play to load vars for managed_node3 7554 1726853166.94647: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853166.94651: Calling groups_plugins_play to load vars for managed_node3 7554 1726853166.95384: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000061 7554 1726853166.95388: WORKER PROCESS EXITING 7554 1726853166.96096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853166.98313: done with get_vars() 7554 1726853166.98340: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:26:06 -0400 (0:00:00.070) 0:00:20.951 ****** 7554 1726853166.98438: entering _queue_task() for managed_node3/include_tasks 7554 1726853166.98739: worker is 1 (out of 1 available) 7554 1726853166.98752: exiting _queue_task() for managed_node3/include_tasks 7554 1726853166.98764: done queuing things up, now waiting for results queue to drain 7554 1726853166.98766: waiting for pending results... 
7554 1726853166.99053: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7554 1726853166.99201: in run() - task 02083763-bbaf-bdc3-98b6-000000000069 7554 1726853166.99222: variable 'ansible_search_path' from source: unknown 7554 1726853166.99229: variable 'ansible_search_path' from source: unknown 7554 1726853166.99266: calling self._execute() 7554 1726853166.99365: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853166.99379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853166.99392: variable 'omit' from source: magic vars 7554 1726853166.99756: variable 'ansible_distribution_major_version' from source: facts 7554 1726853166.99774: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853166.99785: _execute() done 7554 1726853166.99792: dumping result to json 7554 1726853166.99799: done dumping result, returning 7554 1726853166.99810: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-bdc3-98b6-000000000069] 7554 1726853166.99820: sending task result for task 02083763-bbaf-bdc3-98b6-000000000069 7554 1726853166.99926: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000069 7554 1726853166.99934: WORKER PROCESS EXITING 7554 1726853166.99994: no more pending results, returning what we have 7554 1726853166.99999: in VariableManager get_vars() 7554 1726853167.00054: Calling all_inventory to load vars for managed_node3 7554 1726853167.00057: Calling groups_inventory to load vars for managed_node3 7554 1726853167.00060: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853167.00074: Calling all_plugins_play to load vars for managed_node3 7554 1726853167.00077: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853167.00080: Calling groups_plugins_play to load vars for 
managed_node3 7554 1726853167.01716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853167.03192: done with get_vars() 7554 1726853167.03215: variable 'ansible_search_path' from source: unknown 7554 1726853167.03216: variable 'ansible_search_path' from source: unknown 7554 1726853167.03258: we have included files to process 7554 1726853167.03259: generating all_blocks data 7554 1726853167.03261: done generating all_blocks data 7554 1726853167.03265: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853167.03267: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853167.03269: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853167.03799: done processing included file 7554 1726853167.03801: iterating over new_blocks loaded from include file 7554 1726853167.03803: in VariableManager get_vars() 7554 1726853167.03833: done with get_vars() 7554 1726853167.03835: filtering new block on tags 7554 1726853167.03851: done filtering new block on tags 7554 1726853167.03853: in VariableManager get_vars() 7554 1726853167.03880: done with get_vars() 7554 1726853167.03883: filtering new block on tags 7554 1726853167.03901: done filtering new block on tags 7554 1726853167.03903: in VariableManager get_vars() 7554 1726853167.03928: done with get_vars() 7554 1726853167.03929: filtering new block on tags 7554 1726853167.03943: done filtering new block on tags 7554 1726853167.03945: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7554 1726853167.03950: extending task lists for all hosts with included blocks 7554 1726853167.04698: done 
extending task lists 7554 1726853167.04700: done processing included files 7554 1726853167.04701: results queue empty 7554 1726853167.04702: checking for any_errors_fatal 7554 1726853167.04705: done checking for any_errors_fatal 7554 1726853167.04706: checking for max_fail_percentage 7554 1726853167.04707: done checking for max_fail_percentage 7554 1726853167.04708: checking to see if all hosts have failed and the running result is not ok 7554 1726853167.04709: done checking to see if all hosts have failed 7554 1726853167.04709: getting the remaining hosts for this loop 7554 1726853167.04710: done getting the remaining hosts for this loop 7554 1726853167.04713: getting the next task for host managed_node3 7554 1726853167.04717: done getting next task for host managed_node3 7554 1726853167.04719: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7554 1726853167.04722: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853167.04732: getting variables 7554 1726853167.04733: in VariableManager get_vars() 7554 1726853167.04751: Calling all_inventory to load vars for managed_node3 7554 1726853167.04754: Calling groups_inventory to load vars for managed_node3 7554 1726853167.04756: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853167.04761: Calling all_plugins_play to load vars for managed_node3 7554 1726853167.04764: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853167.04767: Calling groups_plugins_play to load vars for managed_node3 7554 1726853167.05879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853167.07368: done with get_vars() 7554 1726853167.07389: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:26:07 -0400 (0:00:00.090) 0:00:21.042 ****** 7554 1726853167.07462: entering _queue_task() for managed_node3/setup 7554 1726853167.07787: worker is 1 (out of 1 available) 7554 1726853167.07798: exiting _queue_task() for managed_node3/setup 7554 1726853167.07811: done queuing things up, now waiting for results queue to drain 7554 1726853167.07812: waiting for pending results... 
7554 1726853167.08190: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7554 1726853167.08269: in run() - task 02083763-bbaf-bdc3-98b6-000000000d46 7554 1726853167.08295: variable 'ansible_search_path' from source: unknown 7554 1726853167.08302: variable 'ansible_search_path' from source: unknown 7554 1726853167.08339: calling self._execute() 7554 1726853167.08435: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853167.08447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853167.08459: variable 'omit' from source: magic vars 7554 1726853167.08822: variable 'ansible_distribution_major_version' from source: facts 7554 1726853167.08841: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853167.09058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853167.11109: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853167.11193: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853167.11239: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853167.11281: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853167.11318: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853167.11400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853167.11440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853167.11543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853167.11547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853167.11549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853167.11595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853167.11625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853167.11658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853167.11704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853167.11724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853167.11893: variable '__network_required_facts' from source: role '' defaults 
7554 1726853167.11909: variable 'ansible_facts' from source: unknown 7554 1726853167.12651: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7554 1726853167.12659: when evaluation is False, skipping this task 7554 1726853167.12667: _execute() done 7554 1726853167.12674: dumping result to json 7554 1726853167.12737: done dumping result, returning 7554 1726853167.12740: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-bdc3-98b6-000000000d46] 7554 1726853167.12742: sending task result for task 02083763-bbaf-bdc3-98b6-000000000d46 7554 1726853167.12809: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000d46 7554 1726853167.12812: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853167.12884: no more pending results, returning what we have 7554 1726853167.12887: results queue empty 7554 1726853167.12888: checking for any_errors_fatal 7554 1726853167.12890: done checking for any_errors_fatal 7554 1726853167.12891: checking for max_fail_percentage 7554 1726853167.12893: done checking for max_fail_percentage 7554 1726853167.12893: checking to see if all hosts have failed and the running result is not ok 7554 1726853167.12895: done checking to see if all hosts have failed 7554 1726853167.12895: getting the remaining hosts for this loop 7554 1726853167.12897: done getting the remaining hosts for this loop 7554 1726853167.12900: getting the next task for host managed_node3 7554 1726853167.12910: done getting next task for host managed_node3 7554 1726853167.12913: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7554 1726853167.12916: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853167.12933: getting variables 7554 1726853167.12935: in VariableManager get_vars() 7554 1726853167.12987: Calling all_inventory to load vars for managed_node3 7554 1726853167.12989: Calling groups_inventory to load vars for managed_node3 7554 1726853167.12992: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853167.13003: Calling all_plugins_play to load vars for managed_node3 7554 1726853167.13005: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853167.13008: Calling groups_plugins_play to load vars for managed_node3 7554 1726853167.14553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853167.16019: done with get_vars() 7554 1726853167.16040: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:26:07 -0400 (0:00:00.086) 0:00:21.128 ****** 7554 1726853167.16145: entering _queue_task() for managed_node3/stat 7554 1726853167.16451: worker is 1 (out of 1 available) 7554 1726853167.16463: exiting _queue_task() 
for managed_node3/stat 7554 1726853167.16477: done queuing things up, now waiting for results queue to drain 7554 1726853167.16479: waiting for pending results... 7554 1726853167.16888: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7554 1726853167.16932: in run() - task 02083763-bbaf-bdc3-98b6-000000000d48 7554 1726853167.16952: variable 'ansible_search_path' from source: unknown 7554 1726853167.16960: variable 'ansible_search_path' from source: unknown 7554 1726853167.17003: calling self._execute() 7554 1726853167.17104: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853167.17119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853167.17134: variable 'omit' from source: magic vars 7554 1726853167.17504: variable 'ansible_distribution_major_version' from source: facts 7554 1726853167.17553: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853167.17703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853167.17981: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853167.18034: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853167.18070: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853167.18206: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853167.18209: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853167.18232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853167.18261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853167.18292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853167.18387: variable '__network_is_ostree' from source: set_fact 7554 1726853167.18399: Evaluated conditional (not __network_is_ostree is defined): False 7554 1726853167.18406: when evaluation is False, skipping this task 7554 1726853167.18413: _execute() done 7554 1726853167.18423: dumping result to json 7554 1726853167.18430: done dumping result, returning 7554 1726853167.18442: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-bdc3-98b6-000000000d48] 7554 1726853167.18453: sending task result for task 02083763-bbaf-bdc3-98b6-000000000d48 7554 1726853167.18701: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000d48 7554 1726853167.18704: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7554 1726853167.18755: no more pending results, returning what we have 7554 1726853167.18758: results queue empty 7554 1726853167.18759: checking for any_errors_fatal 7554 1726853167.18769: done checking for any_errors_fatal 7554 1726853167.18770: checking for max_fail_percentage 7554 1726853167.18774: done checking for max_fail_percentage 7554 1726853167.18774: checking to see if all hosts have failed and the running result is not ok 7554 1726853167.18776: done checking to see if all hosts have failed 7554 
1726853167.18776: getting the remaining hosts for this loop 7554 1726853167.18778: done getting the remaining hosts for this loop 7554 1726853167.18782: getting the next task for host managed_node3 7554 1726853167.18790: done getting next task for host managed_node3 7554 1726853167.18794: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7554 1726853167.18798: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853167.18816: getting variables 7554 1726853167.18818: in VariableManager get_vars() 7554 1726853167.18869: Calling all_inventory to load vars for managed_node3 7554 1726853167.18975: Calling groups_inventory to load vars for managed_node3 7554 1726853167.18978: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853167.18988: Calling all_plugins_play to load vars for managed_node3 7554 1726853167.18991: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853167.18994: Calling groups_plugins_play to load vars for managed_node3 7554 1726853167.20474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853167.21953: done with get_vars() 7554 1726853167.21981: done getting variables 7554 1726853167.22042: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:26:07 -0400 (0:00:00.059) 0:00:21.188 ****** 7554 1726853167.22081: entering _queue_task() for managed_node3/set_fact 7554 1726853167.22422: worker is 1 (out of 1 available) 7554 1726853167.22435: exiting _queue_task() for managed_node3/set_fact 7554 1726853167.22447: done queuing things up, now waiting for results queue to drain 7554 1726853167.22448: waiting for pending results... 
7554 1726853167.22740: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7554 1726853167.22895: in run() - task 02083763-bbaf-bdc3-98b6-000000000d49 7554 1726853167.22916: variable 'ansible_search_path' from source: unknown 7554 1726853167.22922: variable 'ansible_search_path' from source: unknown 7554 1726853167.22959: calling self._execute() 7554 1726853167.23062: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853167.23075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853167.23089: variable 'omit' from source: magic vars 7554 1726853167.23449: variable 'ansible_distribution_major_version' from source: facts 7554 1726853167.23467: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853167.23634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853167.24077: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853167.24081: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853167.24084: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853167.24086: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853167.24103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853167.24133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853167.24163: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853167.24196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853167.24286: variable '__network_is_ostree' from source: set_fact 7554 1726853167.24298: Evaluated conditional (not __network_is_ostree is defined): False 7554 1726853167.24311: when evaluation is False, skipping this task 7554 1726853167.24319: _execute() done 7554 1726853167.24327: dumping result to json 7554 1726853167.24334: done dumping result, returning 7554 1726853167.24421: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-bdc3-98b6-000000000d49] 7554 1726853167.24425: sending task result for task 02083763-bbaf-bdc3-98b6-000000000d49 7554 1726853167.24490: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000d49 7554 1726853167.24495: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7554 1726853167.24569: no more pending results, returning what we have 7554 1726853167.24576: results queue empty 7554 1726853167.24577: checking for any_errors_fatal 7554 1726853167.24583: done checking for any_errors_fatal 7554 1726853167.24584: checking for max_fail_percentage 7554 1726853167.24586: done checking for max_fail_percentage 7554 1726853167.24586: checking to see if all hosts have failed and the running result is not ok 7554 1726853167.24588: done checking to see if all hosts have failed 7554 1726853167.24589: getting the remaining hosts for this loop 7554 1726853167.24590: done getting the remaining hosts for this loop 7554 1726853167.24594: 
getting the next task for host managed_node3 7554 1726853167.24605: done getting next task for host managed_node3 7554 1726853167.24609: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7554 1726853167.24614: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853167.24635: getting variables 7554 1726853167.24637: in VariableManager get_vars() 7554 1726853167.24691: Calling all_inventory to load vars for managed_node3 7554 1726853167.24694: Calling groups_inventory to load vars for managed_node3 7554 1726853167.24696: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853167.24707: Calling all_plugins_play to load vars for managed_node3 7554 1726853167.24710: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853167.24714: Calling groups_plugins_play to load vars for managed_node3 7554 1726853167.26320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853167.27784: done with get_vars() 7554 1726853167.27807: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:26:07 -0400 (0:00:00.058) 0:00:21.246 ****** 7554 1726853167.27905: entering _queue_task() for managed_node3/service_facts 7554 1726853167.28222: worker is 1 (out of 1 available) 7554 1726853167.28236: exiting _queue_task() for managed_node3/service_facts 7554 1726853167.28248: done queuing things up, now waiting for results queue to drain 7554 1726853167.28250: waiting for pending results... 
7554 1726853167.28611: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7554 1726853167.28719: in run() - task 02083763-bbaf-bdc3-98b6-000000000d4b 7554 1726853167.28740: variable 'ansible_search_path' from source: unknown 7554 1726853167.28749: variable 'ansible_search_path' from source: unknown 7554 1726853167.28791: calling self._execute() 7554 1726853167.28923: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853167.28926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853167.28930: variable 'omit' from source: magic vars 7554 1726853167.29282: variable 'ansible_distribution_major_version' from source: facts 7554 1726853167.29298: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853167.29308: variable 'omit' from source: magic vars 7554 1726853167.29386: variable 'omit' from source: magic vars 7554 1726853167.29466: variable 'omit' from source: magic vars 7554 1726853167.29469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853167.29507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853167.29531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853167.29553: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853167.29570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853167.29607: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853167.29615: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853167.29687: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853167.29730: Set connection var ansible_shell_executable to /bin/sh 7554 1726853167.29744: Set connection var ansible_pipelining to False 7554 1726853167.29751: Set connection var ansible_shell_type to sh 7554 1726853167.29756: Set connection var ansible_connection to ssh 7554 1726853167.29769: Set connection var ansible_timeout to 10 7554 1726853167.29780: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853167.29809: variable 'ansible_shell_executable' from source: unknown 7554 1726853167.29816: variable 'ansible_connection' from source: unknown 7554 1726853167.29824: variable 'ansible_module_compression' from source: unknown 7554 1726853167.29830: variable 'ansible_shell_type' from source: unknown 7554 1726853167.29836: variable 'ansible_shell_executable' from source: unknown 7554 1726853167.29841: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853167.29848: variable 'ansible_pipelining' from source: unknown 7554 1726853167.29855: variable 'ansible_timeout' from source: unknown 7554 1726853167.29861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853167.30053: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853167.30120: variable 'omit' from source: magic vars 7554 1726853167.30123: starting attempt loop 7554 1726853167.30126: running the handler 7554 1726853167.30128: _low_level_execute_command(): starting 7554 1726853167.30130: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853167.30808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853167.30823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853167.30837: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853167.30860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853167.30886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853167.30964: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853167.30993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853167.31010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853167.31031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853167.31151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853167.32884: stdout chunk (state=3): >>>/root <<< 7554 1726853167.33060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853167.33109: stdout chunk (state=3): >>><<< 7554 1726853167.33123: stderr chunk (state=3): >>><<< 7554 1726853167.33319: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853167.33322: _low_level_execute_command(): starting 7554 1726853167.33325: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360 `" && echo ansible-tmp-1726853167.3322556-8387-54241481138360="` echo /root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360 `" ) && sleep 0' 7554 1726853167.34435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853167.34454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853167.34514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853167.34619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853167.34667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853167.34724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853167.36915: stdout chunk (state=3): >>>ansible-tmp-1726853167.3322556-8387-54241481138360=/root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360 <<< 7554 1726853167.36969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853167.36985: stdout chunk (state=3): >>><<< 7554 1726853167.36998: stderr chunk (state=3): >>><<< 7554 1726853167.37017: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853167.3322556-8387-54241481138360=/root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853167.37146: variable 'ansible_module_compression' from source: unknown 7554 1726853167.37278: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7554 1726853167.37335: variable 'ansible_facts' from source: unknown 7554 1726853167.37680: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360/AnsiballZ_service_facts.py 7554 1726853167.37804: Sending initial data 7554 1726853167.37813: Sent initial data (159 bytes) 7554 1726853167.39012: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853167.39110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853167.39298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853167.39326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853167.39427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853167.41118: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7554 1726853167.41184: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853167.41280: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853167.41398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpl1fehii6 /root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360/AnsiballZ_service_facts.py <<< 7554 1726853167.41410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360/AnsiballZ_service_facts.py" <<< 7554 1726853167.41500: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpl1fehii6" to remote "/root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360/AnsiballZ_service_facts.py" <<< 7554 1726853167.42889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853167.42964: stderr chunk (state=3): >>><<< 7554 1726853167.42968: stdout chunk (state=3): >>><<< 7554 1726853167.42988: done transferring module to remote 7554 1726853167.43004: _low_level_execute_command(): starting 7554 1726853167.43038: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360/ /root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360/AnsiballZ_service_facts.py && sleep 0' 7554 1726853167.44323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853167.44326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853167.44329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass <<< 7554 1726853167.44334: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853167.44346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853167.44578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853167.44613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853167.44682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853167.46589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853167.46646: stderr chunk (state=3): >>><<< 7554 1726853167.46649: stdout chunk (state=3): >>><<< 7554 1726853167.46815: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853167.46819: _low_level_execute_command(): starting 7554 1726853167.46825: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360/AnsiballZ_service_facts.py && sleep 0' 7554 1726853167.47989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853167.48140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 
1726853169.08550: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": 
{"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": 
{"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 7554 1726853169.08626: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": 
{"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": 
"cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_sta<<< 7554 1726853169.08684: stdout chunk (state=3): >>>t.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": 
"systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7554 1726853169.10254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
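The payload that just finished streaming is the JSON result of Ansible's `service_facts` module: a map from unit name to `{name, state, status, source}` under `ansible_facts.services`. As an illustration only (not part of the run above), a minimal Python sketch filtering such a payload for units that are running but not enabled at boot, using a three-service excerpt taken from values that appear in this log:

```python
import json

# A tiny excerpt shaped like the ansible_facts.services payload in the log above.
payload = json.loads("""
{"ansible_facts": {"services": {
  "sshd.service":     {"name": "sshd.service",     "state": "running", "status": "enabled",  "source": "systemd"},
  "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"},
  "sssd.service":     {"name": "sssd.service",     "state": "stopped", "status": "enabled",  "source": "systemd"}
}}}
""")

services = payload["ansible_facts"]["services"]

# Units running right now but not enabled at boot -- often worth a second look.
running_not_enabled = sorted(
    name for name, svc in services.items()
    if svc["state"] == "running" and svc["status"] != "enabled"
)
print(running_not_enabled)  # ['gssproxy.service']
```

In a playbook the same filter would usually be expressed in Jinja2 against `ansible_facts.services` after a `service_facts` task, but the dictionary shape is identical to the JSON streamed above.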
<<< 7554 1726853169.10263: stdout chunk (state=3): >>><<< 7554 1726853169.10277: stderr chunk (state=3): >>><<< 7554 1726853169.10309: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": 
"sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853169.11320: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853169.11336: _low_level_execute_command(): starting 7554 1726853169.11416: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853167.3322556-8387-54241481138360/ > /dev/null 2>&1 && sleep 0' 7554 1726853169.11961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853169.12000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853169.12093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853169.12125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853169.12177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853169.12194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853169.12302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853169.14235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853169.14254: stdout chunk (state=3): >>><<< 7554 1726853169.14270: stderr chunk (state=3): >>><<< 7554 1726853169.14293: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853169.14311: handler run 
complete 7554 1726853169.14592: variable 'ansible_facts' from source: unknown 7554 1726853169.14738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853169.15319: variable 'ansible_facts' from source: unknown 7554 1726853169.15498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853169.15977: attempt loop complete, returning result 7554 1726853169.15980: _execute() done 7554 1726853169.15982: dumping result to json 7554 1726853169.15984: done dumping result, returning 7554 1726853169.15986: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-bdc3-98b6-000000000d4b] 7554 1726853169.15988: sending task result for task 02083763-bbaf-bdc3-98b6-000000000d4b 7554 1726853169.17043: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000d4b 7554 1726853169.17046: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853169.17174: no more pending results, returning what we have 7554 1726853169.17177: results queue empty 7554 1726853169.17178: checking for any_errors_fatal 7554 1726853169.17182: done checking for any_errors_fatal 7554 1726853169.17183: checking for max_fail_percentage 7554 1726853169.17184: done checking for max_fail_percentage 7554 1726853169.17185: checking to see if all hosts have failed and the running result is not ok 7554 1726853169.17186: done checking to see if all hosts have failed 7554 1726853169.17187: getting the remaining hosts for this loop 7554 1726853169.17188: done getting the remaining hosts for this loop 7554 1726853169.17192: getting the next task for host managed_node3 7554 1726853169.17197: done getting next task for host managed_node3 7554 1726853169.17201: ^ task is: TASK: 
fedora.linux_system_roles.network : Check which packages are installed 7554 1726853169.17205: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853169.17217: getting variables 7554 1726853169.17218: in VariableManager get_vars() 7554 1726853169.17258: Calling all_inventory to load vars for managed_node3 7554 1726853169.17261: Calling groups_inventory to load vars for managed_node3 7554 1726853169.17263: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853169.17276: Calling all_plugins_play to load vars for managed_node3 7554 1726853169.17279: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853169.17283: Calling groups_plugins_play to load vars for managed_node3 7554 1726853169.18508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853169.20152: done with get_vars() 7554 1726853169.20178: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:26:09 -0400 
(0:00:01.923) 0:00:23.170 ****** 7554 1726853169.20283: entering _queue_task() for managed_node3/package_facts 7554 1726853169.20625: worker is 1 (out of 1 available) 7554 1726853169.20878: exiting _queue_task() for managed_node3/package_facts 7554 1726853169.20888: done queuing things up, now waiting for results queue to drain 7554 1726853169.20890: waiting for pending results... 7554 1726853169.21027: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7554 1726853169.21137: in run() - task 02083763-bbaf-bdc3-98b6-000000000d4c 7554 1726853169.21160: variable 'ansible_search_path' from source: unknown 7554 1726853169.21167: variable 'ansible_search_path' from source: unknown 7554 1726853169.21205: calling self._execute() 7554 1726853169.21310: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853169.21322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853169.21345: variable 'omit' from source: magic vars 7554 1726853169.21778: variable 'ansible_distribution_major_version' from source: facts 7554 1726853169.21783: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853169.21786: variable 'omit' from source: magic vars 7554 1726853169.21856: variable 'omit' from source: magic vars 7554 1726853169.21903: variable 'omit' from source: magic vars 7554 1726853169.21989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853169.21999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853169.22022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853169.22048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853169.22063: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853169.22107: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853169.22115: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853169.22176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853169.22239: Set connection var ansible_shell_executable to /bin/sh 7554 1726853169.22256: Set connection var ansible_pipelining to False 7554 1726853169.22262: Set connection var ansible_shell_type to sh 7554 1726853169.22268: Set connection var ansible_connection to ssh 7554 1726853169.22284: Set connection var ansible_timeout to 10 7554 1726853169.22293: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853169.22325: variable 'ansible_shell_executable' from source: unknown 7554 1726853169.22333: variable 'ansible_connection' from source: unknown 7554 1726853169.22340: variable 'ansible_module_compression' from source: unknown 7554 1726853169.22349: variable 'ansible_shell_type' from source: unknown 7554 1726853169.22356: variable 'ansible_shell_executable' from source: unknown 7554 1726853169.22423: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853169.22429: variable 'ansible_pipelining' from source: unknown 7554 1726853169.22432: variable 'ansible_timeout' from source: unknown 7554 1726853169.22434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853169.22587: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853169.22604: variable 'omit' from source: magic vars 7554 1726853169.22613: starting attempt loop 7554 1726853169.22620: running the handler 7554 
1726853169.22649: _low_level_execute_command(): starting 7554 1726853169.22662: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853169.23425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853169.23489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853169.23564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853169.23589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853169.23621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853169.23708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853169.25407: stdout chunk (state=3): >>>/root <<< 7554 1726853169.25576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853169.25580: stdout chunk (state=3): >>><<< 7554 1726853169.25582: stderr chunk (state=3): >>><<< 7554 1726853169.25605: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853169.25719: _low_level_execute_command(): starting 7554 1726853169.25723: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483 `" && echo ansible-tmp-1726853169.2561932-8450-21365490383483="` echo /root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483 `" ) && sleep 0' 7554 1726853169.26335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853169.26351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853169.26369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853169.26394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 7554 1726853169.26448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853169.26541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853169.26558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853169.26582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853169.26686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853169.28707: stdout chunk (state=3): >>>ansible-tmp-1726853169.2561932-8450-21365490383483=/root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483 <<< 7554 1726853169.28893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853169.28897: stdout chunk (state=3): >>><<< 7554 1726853169.28900: stderr chunk (state=3): >>><<< 7554 1726853169.29078: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853169.2561932-8450-21365490383483=/root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853169.29082: variable 'ansible_module_compression' from source: unknown 7554 1726853169.29084: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7554 1726853169.29125: variable 'ansible_facts' from source: unknown 7554 1726853169.29341: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483/AnsiballZ_package_facts.py 7554 1726853169.29555: Sending initial data 7554 1726853169.29558: Sent initial data (159 bytes) 7554 1726853169.30190: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853169.30254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853169.30279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853169.30289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853169.30385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853169.32046: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853169.32138: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853169.32227: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpuvcc03xs /root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483/AnsiballZ_package_facts.py <<< 7554 1726853169.32230: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483/AnsiballZ_package_facts.py" <<< 7554 1726853169.32286: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpuvcc03xs" to remote "/root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483/AnsiballZ_package_facts.py" <<< 7554 1726853169.34308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853169.34311: stderr chunk (state=3): >>><<< 7554 1726853169.34314: stdout chunk (state=3): >>><<< 7554 1726853169.34316: done transferring module to remote 7554 1726853169.34318: _low_level_execute_command(): starting 7554 1726853169.34320: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483/ /root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483/AnsiballZ_package_facts.py && sleep 0' 7554 1726853169.34894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853169.34910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853169.34923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853169.34947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853169.34964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
<<< 7554 1726853169.35054: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853169.35083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853169.35162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853169.37093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853169.37152: stderr chunk (state=3): >>><<< 7554 1726853169.37163: stdout chunk (state=3): >>><<< 7554 1726853169.37188: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853169.37198: _low_level_execute_command(): starting 7554 1726853169.37299: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483/AnsiballZ_package_facts.py && sleep 0' 7554 1726853169.37864: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853169.37882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853169.37896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853169.37919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853169.37935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853169.37949: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853169.38035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853169.38062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853169.38081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853169.38100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853169.38201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853169.83508: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": 
"4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 7554 1726853169.83522: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": 
"jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": 
[{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 7554 1726853169.83563: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": 
"2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 7554 1726853169.83601: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", 
"version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": 
[{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", 
"release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 7554 1726853169.83611: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 7554 1726853169.83642: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 7554 1726853169.83647: stdout chunk 
(state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 7554 1726853169.83652: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", 
"version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 7554 1726853169.83676: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": 
"perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": 
[{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": 
"1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", 
"release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7554 1726853169.85583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
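The module result above ends with the invocation block `{"module_args": {"manager": ["auto"], "strategy": "first"}}`, i.e. this is the output of Ansible's `package_facts` module: `ansible_facts.packages` maps each package name to a list of installed instances, each with `name`, `version`, `release`, `epoch` (possibly null), `arch`, and `source` keys. A minimal sketch of consuming that structure, assuming the shape shown in the log; the `sample` dict below is a hypothetical two-package excerpt, not the full result:

```python
# Hypothetical excerpt shaped like ansible_facts.packages in the log above.
# Each name maps to a LIST because multiple instances of one package
# (e.g. multilib or duplicate installs) can coexist.
sample = {
    "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
             "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
}

def full_evr(inst):
    """Render epoch:version-release, omitting the epoch when it is null."""
    epoch = inst["epoch"]
    prefix = f"{epoch}:" if epoch is not None else ""
    return f"{prefix}{inst['version']}-{inst['release']}"

for name, instances in sample.items():
    for inst in instances:
        # e.g. git-2.45.2-3.el10.x86_64, openssl-1:3.2.2-12.el10.x86_64
        print(f"{name}-{full_evr(inst)}.{inst['arch']}")
```

In a playbook the same data is reached after a `package_facts` task via `ansible_facts.packages['git'][0].version`; the null-epoch handling above mirrors how rpm omits an unset epoch from its NEVRA strings.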
<<< 7554 1726853169.85608: stderr chunk (state=3): >>><<< 7554 1726853169.85611: stdout chunk (state=3): >>><<< 7554 1726853169.85655: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853169.86924: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853169.86940: _low_level_execute_command(): starting 7554 1726853169.86945: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853169.2561932-8450-21365490383483/ > /dev/null 2>&1 && sleep 0' 7554 1726853169.87406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853169.87409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853169.87411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853169.87416: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853169.87419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853169.87477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853169.87482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853169.87484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853169.87548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853169.89456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853169.89483: stderr chunk (state=3): >>><<< 7554 1726853169.89486: stdout chunk (state=3): >>><<< 7554 1726853169.89499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853169.89504: handler run complete 7554 1726853169.89964: variable 'ansible_facts' from source: unknown 7554 1726853169.90275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853169.91316: variable 'ansible_facts' from source: unknown 7554 1726853169.91554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853169.91933: attempt loop complete, returning result 7554 1726853169.91941: _execute() done 7554 1726853169.91947: dumping result to json 7554 1726853169.92064: done dumping result, returning 7554 1726853169.92067: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-bdc3-98b6-000000000d4c] 7554 1726853169.92075: sending task result for task 02083763-bbaf-bdc3-98b6-000000000d4c 7554 1726853169.93388: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000d4c 7554 1726853169.93392: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853169.93479: no more pending results, returning what we have 7554 1726853169.93481: results queue empty 7554 1726853169.93482: checking for any_errors_fatal 7554 1726853169.93485: done checking for any_errors_fatal 7554 1726853169.93486: checking for max_fail_percentage 7554 1726853169.93487: done checking for max_fail_percentage 7554 1726853169.93487: checking to see if all hosts have failed and the running result is not ok 7554 1726853169.93488: done checking to see if all hosts have failed 7554 1726853169.93488: getting the remaining hosts for this loop 7554 1726853169.93489: done getting the remaining hosts for this loop 7554 1726853169.93491: getting the next task for host 
managed_node3 7554 1726853169.93496: done getting next task for host managed_node3 7554 1726853169.93499: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7554 1726853169.93501: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853169.93516: getting variables 7554 1726853169.93517: in VariableManager get_vars() 7554 1726853169.93545: Calling all_inventory to load vars for managed_node3 7554 1726853169.93546: Calling groups_inventory to load vars for managed_node3 7554 1726853169.93548: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853169.93554: Calling all_plugins_play to load vars for managed_node3 7554 1726853169.93556: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853169.93558: Calling groups_plugins_play to load vars for managed_node3 7554 1726853169.94216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853169.95061: done with get_vars() 7554 1726853169.95081: done getting variables 7554 1726853169.95123: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task 
path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:26:09 -0400 (0:00:00.748) 0:00:23.919 ****** 7554 1726853169.95155: entering _queue_task() for managed_node3/debug 7554 1726853169.95392: worker is 1 (out of 1 available) 7554 1726853169.95407: exiting _queue_task() for managed_node3/debug 7554 1726853169.95419: done queuing things up, now waiting for results queue to drain 7554 1726853169.95421: waiting for pending results... 7554 1726853169.95625: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7554 1726853169.95723: in run() - task 02083763-bbaf-bdc3-98b6-00000000006a 7554 1726853169.95736: variable 'ansible_search_path' from source: unknown 7554 1726853169.95740: variable 'ansible_search_path' from source: unknown 7554 1726853169.95770: calling self._execute() 7554 1726853169.95847: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853169.95852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853169.95862: variable 'omit' from source: magic vars 7554 1726853169.96135: variable 'ansible_distribution_major_version' from source: facts 7554 1726853169.96144: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853169.96153: variable 'omit' from source: magic vars 7554 1726853169.96193: variable 'omit' from source: magic vars 7554 1726853169.96264: variable 'network_provider' from source: set_fact 7554 1726853169.96279: variable 'omit' from source: magic vars 7554 1726853169.96312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853169.96338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853169.96362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853169.96374: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853169.96384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853169.96409: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853169.96412: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853169.96414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853169.96486: Set connection var ansible_shell_executable to /bin/sh 7554 1726853169.96493: Set connection var ansible_pipelining to False 7554 1726853169.96496: Set connection var ansible_shell_type to sh 7554 1726853169.96499: Set connection var ansible_connection to ssh 7554 1726853169.96509: Set connection var ansible_timeout to 10 7554 1726853169.96511: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853169.96528: variable 'ansible_shell_executable' from source: unknown 7554 1726853169.96531: variable 'ansible_connection' from source: unknown 7554 1726853169.96533: variable 'ansible_module_compression' from source: unknown 7554 1726853169.96536: variable 'ansible_shell_type' from source: unknown 7554 1726853169.96538: variable 'ansible_shell_executable' from source: unknown 7554 1726853169.96540: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853169.96546: variable 'ansible_pipelining' from source: unknown 7554 1726853169.96549: variable 'ansible_timeout' from source: unknown 7554 1726853169.96553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853169.96655: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853169.96664: variable 'omit' from source: magic vars 7554 1726853169.96668: starting attempt loop 7554 1726853169.96673: running the handler 7554 1726853169.96709: handler run complete 7554 1726853169.96720: attempt loop complete, returning result 7554 1726853169.96724: _execute() done 7554 1726853169.96727: dumping result to json 7554 1726853169.96730: done dumping result, returning 7554 1726853169.96735: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-bdc3-98b6-00000000006a] 7554 1726853169.96740: sending task result for task 02083763-bbaf-bdc3-98b6-00000000006a 7554 1726853169.96819: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000006a 7554 1726853169.96821: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7554 1726853169.96896: no more pending results, returning what we have 7554 1726853169.96899: results queue empty 7554 1726853169.96899: checking for any_errors_fatal 7554 1726853169.96910: done checking for any_errors_fatal 7554 1726853169.96911: checking for max_fail_percentage 7554 1726853169.96912: done checking for max_fail_percentage 7554 1726853169.96913: checking to see if all hosts have failed and the running result is not ok 7554 1726853169.96914: done checking to see if all hosts have failed 7554 1726853169.96914: getting the remaining hosts for this loop 7554 1726853169.96915: done getting the remaining hosts for this loop 7554 1726853169.96919: getting the next task for host managed_node3 7554 1726853169.96925: done getting next task for host managed_node3 7554 1726853169.96928: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts 
provider 7554 1726853169.96933: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853169.96943: getting variables 7554 1726853169.96945: in VariableManager get_vars() 7554 1726853169.96984: Calling all_inventory to load vars for managed_node3 7554 1726853169.96987: Calling groups_inventory to load vars for managed_node3 7554 1726853169.96989: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853169.96998: Calling all_plugins_play to load vars for managed_node3 7554 1726853169.97000: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853169.97003: Calling groups_plugins_play to load vars for managed_node3 7554 1726853169.97794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853169.98635: done with get_vars() 7554 1726853169.98651: done getting variables 7554 1726853169.98695: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:26:09 -0400 (0:00:00.035) 0:00:23.954 ****** 7554 1726853169.98719: entering _queue_task() for managed_node3/fail 7554 1726853169.98924: worker is 1 (out of 1 available) 7554 1726853169.98938: exiting _queue_task() for managed_node3/fail 7554 1726853169.98949: done queuing things up, now waiting for results queue to drain 7554 1726853169.98951: waiting for pending results... 7554 1726853169.99135: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7554 1726853169.99222: in run() - task 02083763-bbaf-bdc3-98b6-00000000006b 7554 1726853169.99234: variable 'ansible_search_path' from source: unknown 7554 1726853169.99237: variable 'ansible_search_path' from source: unknown 7554 1726853169.99266: calling self._execute() 7554 1726853169.99338: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853169.99342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853169.99354: variable 'omit' from source: magic vars 7554 1726853169.99627: variable 'ansible_distribution_major_version' from source: facts 7554 1726853169.99636: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853169.99725: variable 'network_state' from source: role '' defaults 7554 1726853169.99732: Evaluated conditional (network_state != {}): False 7554 1726853169.99736: when evaluation is False, skipping this task 7554 1726853169.99738: _execute() done 7554 1726853169.99741: dumping result to json 7554 1726853169.99743: done dumping result, returning 7554 1726853169.99752: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the 
initscripts provider [02083763-bbaf-bdc3-98b6-00000000006b] 7554 1726853169.99758: sending task result for task 02083763-bbaf-bdc3-98b6-00000000006b 7554 1726853169.99843: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000006b 7554 1726853169.99846: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853169.99892: no more pending results, returning what we have 7554 1726853169.99895: results queue empty 7554 1726853169.99896: checking for any_errors_fatal 7554 1726853169.99903: done checking for any_errors_fatal 7554 1726853169.99903: checking for max_fail_percentage 7554 1726853169.99905: done checking for max_fail_percentage 7554 1726853169.99906: checking to see if all hosts have failed and the running result is not ok 7554 1726853169.99906: done checking to see if all hosts have failed 7554 1726853169.99907: getting the remaining hosts for this loop 7554 1726853169.99908: done getting the remaining hosts for this loop 7554 1726853169.99912: getting the next task for host managed_node3 7554 1726853169.99917: done getting next task for host managed_node3 7554 1726853169.99921: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7554 1726853169.99923: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853169.99938: getting variables 7554 1726853169.99940: in VariableManager get_vars() 7554 1726853169.99979: Calling all_inventory to load vars for managed_node3 7554 1726853169.99981: Calling groups_inventory to load vars for managed_node3 7554 1726853169.99983: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853169.99991: Calling all_plugins_play to load vars for managed_node3 7554 1726853169.99993: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853169.99996: Calling groups_plugins_play to load vars for managed_node3 7554 1726853170.00717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853170.01573: done with get_vars() 7554 1726853170.01591: done getting variables 7554 1726853170.01632: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:26:10 -0400 (0:00:00.029) 0:00:23.984 ****** 7554 1726853170.01658: entering _queue_task() for managed_node3/fail 7554 1726853170.01889: worker is 1 (out of 1 available) 7554 1726853170.01905: exiting _queue_task() for managed_node3/fail 7554 1726853170.01918: done queuing things up, now waiting for results queue to drain 7554 1726853170.01919: waiting for pending results... 
7554 1726853170.02112: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7554 1726853170.02205: in run() - task 02083763-bbaf-bdc3-98b6-00000000006c 7554 1726853170.02216: variable 'ansible_search_path' from source: unknown 7554 1726853170.02220: variable 'ansible_search_path' from source: unknown 7554 1726853170.02254: calling self._execute() 7554 1726853170.02329: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853170.02333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853170.02343: variable 'omit' from source: magic vars 7554 1726853170.02615: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.02625: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853170.02709: variable 'network_state' from source: role '' defaults 7554 1726853170.02717: Evaluated conditional (network_state != {}): False 7554 1726853170.02720: when evaluation is False, skipping this task 7554 1726853170.02723: _execute() done 7554 1726853170.02726: dumping result to json 7554 1726853170.02729: done dumping result, returning 7554 1726853170.02736: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-bdc3-98b6-00000000006c] 7554 1726853170.02741: sending task result for task 02083763-bbaf-bdc3-98b6-00000000006c 7554 1726853170.02828: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000006c 7554 1726853170.02831: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853170.02881: no more pending results, returning what we have 7554 1726853170.02885: results queue 
empty 7554 1726853170.02885: checking for any_errors_fatal 7554 1726853170.02894: done checking for any_errors_fatal 7554 1726853170.02895: checking for max_fail_percentage 7554 1726853170.02897: done checking for max_fail_percentage 7554 1726853170.02898: checking to see if all hosts have failed and the running result is not ok 7554 1726853170.02899: done checking to see if all hosts have failed 7554 1726853170.02899: getting the remaining hosts for this loop 7554 1726853170.02900: done getting the remaining hosts for this loop 7554 1726853170.02904: getting the next task for host managed_node3 7554 1726853170.02909: done getting next task for host managed_node3 7554 1726853170.02912: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7554 1726853170.02915: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853170.02931: getting variables 7554 1726853170.02933: in VariableManager get_vars() 7554 1726853170.02975: Calling all_inventory to load vars for managed_node3 7554 1726853170.02977: Calling groups_inventory to load vars for managed_node3 7554 1726853170.02979: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853170.02988: Calling all_plugins_play to load vars for managed_node3 7554 1726853170.02991: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853170.02993: Calling groups_plugins_play to load vars for managed_node3 7554 1726853170.07026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853170.07863: done with get_vars() 7554 1726853170.07880: done getting variables 7554 1726853170.07912: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:26:10 -0400 (0:00:00.062) 0:00:24.046 ****** 7554 1726853170.07930: entering _queue_task() for managed_node3/fail 7554 1726853170.08179: worker is 1 (out of 1 available) 7554 1726853170.08193: exiting _queue_task() for managed_node3/fail 7554 1726853170.08205: done queuing things up, now waiting for results queue to drain 7554 1726853170.08208: waiting for pending results... 
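The skip recorded for the "Abort applying the network state configuration" task comes down to the role default `network_state: {}`: the `when: network_state != {}` guard is only True when the caller supplies a non-empty state. A minimal sketch of that evaluation (the empty dict is the role default named in the log; everything else here is illustrative):

```python
# From the log: 'network_state' comes from "role '' defaults", i.e. an empty dict.
network_state = {}

# Jinja2 evaluates `network_state != {}` against the variable's value; with the
# default left in place the guard is False and the task is skipped.
should_run = network_state != {}
print(should_run)  # False -> "when evaluation is False, skipping this task"
```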
7554 1726853170.08392: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7554 1726853170.08486: in run() - task 02083763-bbaf-bdc3-98b6-00000000006d 7554 1726853170.08498: variable 'ansible_search_path' from source: unknown 7554 1726853170.08501: variable 'ansible_search_path' from source: unknown 7554 1726853170.08530: calling self._execute() 7554 1726853170.08776: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853170.08780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853170.08784: variable 'omit' from source: magic vars 7554 1726853170.09154: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.09176: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853170.09363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853170.11761: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853170.11856: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853170.11902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853170.11951: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853170.11988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853170.12084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.12122: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.12165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.12214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.12237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.12354: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.12391: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7554 1726853170.12525: variable 'ansible_distribution' from source: facts 7554 1726853170.12536: variable '__network_rh_distros' from source: role '' defaults 7554 1726853170.12554: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7554 1726853170.12841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.12875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.12915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.12964: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.13027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.13053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.13084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.13135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.13167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.13190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.13236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.13262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 
1726853170.13282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.13305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.13316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.13511: variable 'network_connections' from source: task vars 7554 1726853170.13521: variable 'interface' from source: play vars 7554 1726853170.13572: variable 'interface' from source: play vars 7554 1726853170.13581: variable 'network_state' from source: role '' defaults 7554 1726853170.13625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853170.13751: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853170.13779: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853170.13804: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853170.13827: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853170.13859: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853170.13875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 7554 1726853170.13898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.13918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853170.13937: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 7554 1726853170.13940: when evaluation is False, skipping this task 7554 1726853170.13943: _execute() done 7554 1726853170.13948: dumping result to json 7554 1726853170.13950: done dumping result, returning 7554 1726853170.13958: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-bdc3-98b6-00000000006d] 7554 1726853170.13964: sending task result for task 02083763-bbaf-bdc3-98b6-00000000006d 7554 1726853170.14052: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000006d 7554 1726853170.14054: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 7554 1726853170.14105: no more pending results, returning what we have 7554 1726853170.14108: results queue empty 7554 1726853170.14109: checking for any_errors_fatal 7554 
1726853170.14117: done checking for any_errors_fatal 7554 1726853170.14118: checking for max_fail_percentage 7554 1726853170.14119: done checking for max_fail_percentage 7554 1726853170.14120: checking to see if all hosts have failed and the running result is not ok 7554 1726853170.14121: done checking to see if all hosts have failed 7554 1726853170.14122: getting the remaining hosts for this loop 7554 1726853170.14123: done getting the remaining hosts for this loop 7554 1726853170.14126: getting the next task for host managed_node3 7554 1726853170.14133: done getting next task for host managed_node3 7554 1726853170.14136: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7554 1726853170.14139: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853170.14156: getting variables 7554 1726853170.14158: in VariableManager get_vars() 7554 1726853170.14214: Calling all_inventory to load vars for managed_node3 7554 1726853170.14217: Calling groups_inventory to load vars for managed_node3 7554 1726853170.14219: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853170.14228: Calling all_plugins_play to load vars for managed_node3 7554 1726853170.14231: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853170.14233: Calling groups_plugins_play to load vars for managed_node3 7554 1726853170.15013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853170.15964: done with get_vars() 7554 1726853170.15981: done getting variables 7554 1726853170.16024: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:26:10 -0400 (0:00:00.081) 0:00:24.128 ****** 7554 1726853170.16046: entering _queue_task() for managed_node3/dnf 7554 1726853170.16274: worker is 1 (out of 1 available) 7554 1726853170.16289: exiting _queue_task() for managed_node3/dnf 7554 1726853170.16302: done queuing things up, now waiting for results queue to drain 7554 1726853170.16304: waiting for pending results... 
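The teaming guard that evaluated to False above can be replayed outside Ansible. A rough Python equivalent of `selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`, using a hypothetical `network_connections` list (the log only shows that the real one contains no team profiles):

```python
import re

def defines_team(items):
    """Mimic: items | selectattr("type", "defined")
                    | selectattr("type", "match", "^team$") | list | length > 0"""
    return len([i for i in items
                if "type" in i and re.match("^team$", i["type"])]) > 0

# Hypothetical sample data; the real values come from task vars and role defaults.
network_connections = [{"name": "eth0", "type": "ethernet"}]
network_state = {}

abort_on_el10 = (defines_team(network_connections)
                 or defines_team(network_state.get("interfaces", [])))
print(abort_on_el10)  # False -> task skipped, matching the log
```

Jinja2's `match` test anchors at the start of the string like `re.match`, so `^team$` only admits the exact type `team`.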
7554 1726853170.16484: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7554 1726853170.16576: in run() - task 02083763-bbaf-bdc3-98b6-00000000006e 7554 1726853170.16588: variable 'ansible_search_path' from source: unknown 7554 1726853170.16592: variable 'ansible_search_path' from source: unknown 7554 1726853170.16622: calling self._execute() 7554 1726853170.16703: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853170.16707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853170.16716: variable 'omit' from source: magic vars 7554 1726853170.16997: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.17006: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853170.17143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853170.18656: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853170.18709: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853170.18736: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853170.18764: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853170.18786: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853170.18843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.18865: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.18885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.18910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.18924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.19006: variable 'ansible_distribution' from source: facts 7554 1726853170.19009: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.19025: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7554 1726853170.19105: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853170.19191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.19208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.19224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.19252: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.19267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.19297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.19312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.19328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.19354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.19369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.19398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.19413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.19430: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.19455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.19466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.19568: variable 'network_connections' from source: task vars 7554 1726853170.19583: variable 'interface' from source: play vars 7554 1726853170.19625: variable 'interface' from source: play vars 7554 1726853170.19676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853170.19795: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853170.19825: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853170.19848: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853170.19870: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853170.19904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853170.19921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853170.19977: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.19981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853170.19994: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853170.20148: variable 'network_connections' from source: task vars 7554 1726853170.20151: variable 'interface' from source: play vars 7554 1726853170.20195: variable 'interface' from source: play vars 7554 1726853170.20213: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7554 1726853170.20216: when evaluation is False, skipping this task 7554 1726853170.20220: _execute() done 7554 1726853170.20225: dumping result to json 7554 1726853170.20227: done dumping result, returning 7554 1726853170.20244: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-00000000006e] 7554 1726853170.20246: sending task result for task 02083763-bbaf-bdc3-98b6-00000000006e 7554 1726853170.20328: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000006e 7554 1726853170.20330: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7554 1726853170.20394: no more pending results, returning what we have 7554 1726853170.20397: results queue empty 7554 1726853170.20397: checking for any_errors_fatal 7554 1726853170.20403: done checking for any_errors_fatal 7554 1726853170.20404: checking for 
max_fail_percentage 7554 1726853170.20406: done checking for max_fail_percentage 7554 1726853170.20406: checking to see if all hosts have failed and the running result is not ok 7554 1726853170.20407: done checking to see if all hosts have failed 7554 1726853170.20408: getting the remaining hosts for this loop 7554 1726853170.20409: done getting the remaining hosts for this loop 7554 1726853170.20413: getting the next task for host managed_node3 7554 1726853170.20419: done getting next task for host managed_node3 7554 1726853170.20423: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7554 1726853170.20426: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853170.20446: getting variables 7554 1726853170.20448: in VariableManager get_vars() 7554 1726853170.20496: Calling all_inventory to load vars for managed_node3 7554 1726853170.20499: Calling groups_inventory to load vars for managed_node3 7554 1726853170.20501: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853170.20510: Calling all_plugins_play to load vars for managed_node3 7554 1726853170.20512: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853170.20514: Calling groups_plugins_play to load vars for managed_node3 7554 1726853170.21313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853170.22189: done with get_vars() 7554 1726853170.22207: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7554 1726853170.22263: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:26:10 -0400 (0:00:00.062) 0:00:24.190 ****** 7554 1726853170.22287: entering _queue_task() for managed_node3/yum 7554 1726853170.22530: worker is 1 (out of 1 available) 7554 1726853170.22548: exiting _queue_task() for managed_node3/yum 7554 1726853170.22560: done queuing things up, now waiting for results queue to drain 7554 1726853170.22561: waiting for pending results... 
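The DNF check above is gated twice: first on `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` (True here), then on `__network_wireless_connections_defined or __network_team_connections_defined` (False, so it skips). A hedged sketch of that second guard, assuming the role derives both flags with the same `selectattr`/`match` pattern over the connection list; the sample data is hypothetical:

```python
import re

def any_of_type(conns, type_regex):
    # Rough stand-in for the role's selectattr("type", "match", ...) defaults.
    return any("type" in c and re.match(type_regex, c["type"]) for c in conns)

network_connections = [{"name": "eth0", "type": "ethernet"}]  # hypothetical

wireless_defined = any_of_type(network_connections, "^wireless$")
team_defined = any_of_type(network_connections, "^team$")

run_dnf_check = wireless_defined or team_defined
print(run_dnf_check)  # False -> the dnf task is skipped
```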
7554 1726853170.22748: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
7554 1726853170.22844: in run() - task 02083763-bbaf-bdc3-98b6-00000000006f
7554 1726853170.22854: variable 'ansible_search_path' from source: unknown
7554 1726853170.22857: variable 'ansible_search_path' from source: unknown
7554 1726853170.22887: calling self._execute()
7554 1726853170.22964: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853170.22970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853170.22980: variable 'omit' from source: magic vars
7554 1726853170.23261: variable 'ansible_distribution_major_version' from source: facts
7554 1726853170.23273: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853170.23394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7554 1726853170.24869: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7554 1726853170.24920: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7554 1726853170.24947: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7554 1726853170.24975: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7554 1726853170.24996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7554 1726853170.25052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853170.25079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853170.25097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.25123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853170.25133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853170.25213: variable 'ansible_distribution_major_version' from source: facts
7554 1726853170.25226: Evaluated conditional (ansible_distribution_major_version | int < 8): False
7554 1726853170.25229: when evaluation is False, skipping this task
7554 1726853170.25232: _execute() done
7554 1726853170.25235: dumping result to json
7554 1726853170.25237: done dumping result, returning
7554 1726853170.25247: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-00000000006f]
7554 1726853170.25252: sending task result for task 02083763-bbaf-bdc3-98b6-00000000006f
7554 1726853170.25341: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000006f
7554 1726853170.25344: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
7554 1726853170.25398: no more pending results, returning what we have
7554 1726853170.25401: results queue empty
7554 1726853170.25402: checking for any_errors_fatal
7554 1726853170.25407: done checking for any_errors_fatal
7554 1726853170.25408: checking for max_fail_percentage
7554 1726853170.25409: done checking for max_fail_percentage
7554 1726853170.25410: checking to see if all hosts have failed and the running result is not ok
7554 1726853170.25411: done checking to see if all hosts have failed
7554 1726853170.25412: getting the remaining hosts for this loop
7554 1726853170.25413: done getting the remaining hosts for this loop
7554 1726853170.25417: getting the next task for host managed_node3
7554 1726853170.25424: done getting next task for host managed_node3
7554 1726853170.25427: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
7554 1726853170.25430: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853170.25447: getting variables
7554 1726853170.25449: in VariableManager get_vars()
7554 1726853170.25503: Calling all_inventory to load vars for managed_node3
7554 1726853170.25506: Calling groups_inventory to load vars for managed_node3
7554 1726853170.25508: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853170.25517: Calling all_plugins_play to load vars for managed_node3
7554 1726853170.25519: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853170.25521: Calling groups_plugins_play to load vars for managed_node3
7554 1726853170.26419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853170.27262: done with get_vars()
7554 1726853170.27279: done getting variables
7554 1726853170.27320: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024  13:26:10 -0400 (0:00:00.050)       0:00:24.240 ******
7554 1726853170.27343: entering _queue_task() for managed_node3/fail
7554 1726853170.27584: worker is 1 (out of 1 available)
7554 1726853170.27599: exiting _queue_task() for managed_node3/fail
7554 1726853170.27610: done queuing things up, now waiting for results queue to drain
7554 1726853170.27612: waiting for pending results...
7554 1726853170.27801: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
7554 1726853170.27894: in run() - task 02083763-bbaf-bdc3-98b6-000000000070
7554 1726853170.27906: variable 'ansible_search_path' from source: unknown
7554 1726853170.27909: variable 'ansible_search_path' from source: unknown
7554 1726853170.27937: calling self._execute()
7554 1726853170.28023: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853170.28028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853170.28036: variable 'omit' from source: magic vars
7554 1726853170.28322: variable 'ansible_distribution_major_version' from source: facts
7554 1726853170.28332: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853170.28416: variable '__network_wireless_connections_defined' from source: role '' defaults
7554 1726853170.28545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7554 1726853170.30031: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7554 1726853170.30085: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7554 1726853170.30117: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7554 1726853170.30138: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7554 1726853170.30161: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7554 1726853170.30220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853170.30243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853170.30262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.30289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853170.30300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853170.30337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853170.30354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853170.30370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.30397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853170.30408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853170.30435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853170.30457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853170.30475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.30498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853170.30509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853170.30625: variable 'network_connections' from source: task vars
7554 1726853170.30636: variable 'interface' from source: play vars
7554 1726853170.30689: variable 'interface' from source: play vars
7554 1726853170.30737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7554 1726853170.30849: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7554 1726853170.30886: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7554 1726853170.30909: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7554 1726853170.30929: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7554 1726853170.30961: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7554 1726853170.30989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7554 1726853170.31007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.31025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7554 1726853170.31065: variable '__network_team_connections_defined' from source: role '' defaults
7554 1726853170.31217: variable 'network_connections' from source: task vars
7554 1726853170.31220: variable 'interface' from source: play vars
7554 1726853170.31266: variable 'interface' from source: play vars
7554 1726853170.31286: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
7554 1726853170.31289: when evaluation is False, skipping this task
7554 1726853170.31292: _execute() done
7554 1726853170.31294: dumping result to json
7554 1726853170.31296: done dumping result, returning
7554 1726853170.31304: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-000000000070]
7554 1726853170.31310: sending task result for task 02083763-bbaf-bdc3-98b6-000000000070
7554 1726853170.31407: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000070
7554 1726853170.31409: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
7554 1726853170.31477: no more pending results, returning what we have
7554 1726853170.31480: results queue empty
7554 1726853170.31481: checking for any_errors_fatal
7554 1726853170.31487: done checking for any_errors_fatal
7554 1726853170.31487: checking for max_fail_percentage
7554 1726853170.31489: done checking for max_fail_percentage
7554 1726853170.31490: checking to see if all hosts have failed and the running result is not ok
7554 1726853170.31491: done checking to see if all hosts have failed
7554 1726853170.31492: getting the remaining hosts for this loop
7554 1726853170.31493: done getting the remaining hosts for this loop
7554 1726853170.31496: getting the next task for host managed_node3
7554 1726853170.31502: done getting next task for host managed_node3
7554 1726853170.31506: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
7554 1726853170.31509: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853170.31528: getting variables
7554 1726853170.31530: in VariableManager get_vars()
7554 1726853170.31578: Calling all_inventory to load vars for managed_node3
7554 1726853170.31580: Calling groups_inventory to load vars for managed_node3
7554 1726853170.31582: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853170.31591: Calling all_plugins_play to load vars for managed_node3
7554 1726853170.31593: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853170.31595: Calling groups_plugins_play to load vars for managed_node3
7554 1726853170.32380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853170.33248: done with get_vars()
7554 1726853170.33264: done getting variables
7554 1726853170.33310: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024  13:26:10 -0400 (0:00:00.059)       0:00:24.300 ******
7554 1726853170.33336: entering _queue_task() for managed_node3/package
7554 1726853170.33584: worker is 1 (out of 1 available)
7554 1726853170.33601: exiting _queue_task() for managed_node3/package
7554 1726853170.33613: done queuing things up, now waiting for results queue to drain
7554 1726853170.33615: waiting for pending results...
7554 1726853170.33796: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
7554 1726853170.33890: in run() - task 02083763-bbaf-bdc3-98b6-000000000071
7554 1726853170.33902: variable 'ansible_search_path' from source: unknown
7554 1726853170.33906: variable 'ansible_search_path' from source: unknown
7554 1726853170.33934: calling self._execute()
7554 1726853170.34012: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853170.34017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853170.34027: variable 'omit' from source: magic vars
7554 1726853170.34304: variable 'ansible_distribution_major_version' from source: facts
7554 1726853170.34313: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853170.34447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7554 1726853170.34635: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7554 1726853170.34668: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7554 1726853170.34693: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7554 1726853170.34746: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7554 1726853170.34833: variable 'network_packages' from source: role '' defaults
7554 1726853170.34904: variable '__network_provider_setup' from source: role '' defaults
7554 1726853170.34914: variable '__network_service_name_default_nm' from source: role '' defaults
7554 1726853170.34962: variable '__network_service_name_default_nm' from source: role '' defaults
7554 1726853170.34970: variable '__network_packages_default_nm' from source: role '' defaults
7554 1726853170.35013: variable '__network_packages_default_nm' from source: role '' defaults
7554 1726853170.35129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7554 1726853170.36656: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7554 1726853170.36701: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7554 1726853170.36726: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7554 1726853170.36749: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7554 1726853170.36769: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7554 1726853170.36828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853170.36847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853170.36865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.36975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853170.36977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853170.36979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853170.36980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853170.36981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.36989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853170.37002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853170.37142: variable '__network_packages_default_gobject_packages' from source: role '' defaults
7554 1726853170.37217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853170.37246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853170.37262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.37288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853170.37298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853170.37361: variable 'ansible_python' from source: facts
7554 1726853170.37383: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
7554 1726853170.37444: variable '__network_wpa_supplicant_required' from source: role '' defaults
7554 1726853170.37496: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
7554 1726853170.37580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853170.37597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853170.37613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.37637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853170.37653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853170.37684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853170.37703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853170.37719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.37745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853170.37755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853170.37847: variable 'network_connections' from source: task vars
7554 1726853170.37851: variable 'interface' from source: play vars
7554 1726853170.37923: variable 'interface' from source: play vars
7554 1726853170.37972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7554 1726853170.37994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7554 1726853170.38014: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853170.38034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7554 1726853170.38069: variable '__network_wireless_connections_defined' from source: role '' defaults
7554 1726853170.38247: variable 'network_connections' from source: task vars
7554 1726853170.38251: variable 'interface' from source: play vars
7554 1726853170.38323: variable 'interface' from source: play vars
7554 1726853170.38345: variable '__network_packages_default_wireless' from source: role '' defaults
7554 1726853170.38403: variable '__network_wireless_connections_defined' from source: role '' defaults
7554 1726853170.38597: variable 'network_connections' from source: task vars
7554 1726853170.38600: variable 'interface' from source: play vars
7554 1726853170.38649: variable 'interface' from source: play vars
7554 1726853170.38665: variable '__network_packages_default_team' from source: role '' defaults
7554 1726853170.38718: variable '__network_team_connections_defined' from source: role '' defaults
7554 1726853170.38910: variable 'network_connections' from source: task vars
7554 1726853170.38913: variable 'interface' from source: play vars
7554 1726853170.38958: variable 'interface' from source: play vars
7554 1726853170.39003: variable '__network_service_name_default_initscripts' from source: role '' defaults
7554 1726853170.39046: variable '__network_service_name_default_initscripts' from source: role '' defaults
7554 1726853170.39050: variable '__network_packages_default_initscripts' from source: role '' defaults
7554 1726853170.39097: variable '__network_packages_default_initscripts' from source: role '' defaults
7554 1726853170.39238: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
7554 1726853170.39538: variable 'network_connections' from source: task vars
7554 1726853170.39542: variable 'interface' from source: play vars
7554 1726853170.39587: variable 'interface' from source: play vars
7554 1726853170.39593: variable 'ansible_distribution' from source: facts
7554 1726853170.39596: variable '__network_rh_distros' from source: role '' defaults
7554 1726853170.39603: variable 'ansible_distribution_major_version' from source: facts
7554 1726853170.39617: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
7554 1726853170.39724: variable 'ansible_distribution' from source: facts
7554 1726853170.39728: variable '__network_rh_distros' from source: role '' defaults
7554 1726853170.39730: variable 'ansible_distribution_major_version' from source: facts
7554 1726853170.39740: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
7554 1726853170.39847: variable 'ansible_distribution' from source: facts
7554 1726853170.39851: variable '__network_rh_distros' from source: role '' defaults
7554 1726853170.39856: variable 'ansible_distribution_major_version' from source: facts
7554 1726853170.39882: variable 'network_provider' from source: set_fact
7554 1726853170.39894: variable 'ansible_facts' from source: unknown
7554 1726853170.40268: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
7554 1726853170.40277: when evaluation is False, skipping this task
7554 1726853170.40279: _execute() done
7554 1726853170.40282: dumping result to json
7554 1726853170.40284: done dumping result, returning
7554 1726853170.40287: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-bdc3-98b6-000000000071]
7554 1726853170.40292: sending task result for task 02083763-bbaf-bdc3-98b6-000000000071
7554 1726853170.40382: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000071
7554 1726853170.40385: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
7554 1726853170.40430: no more pending results, returning what we have
7554 1726853170.40433: results queue empty
7554 1726853170.40434: checking for any_errors_fatal
7554 1726853170.40442: done checking for any_errors_fatal
7554 1726853170.40442: checking for max_fail_percentage
7554 1726853170.40444: done checking for max_fail_percentage
7554 1726853170.40444: checking to see if all hosts have failed and the running result is not ok
7554 1726853170.40445: done checking to see if all hosts have failed
7554 1726853170.40446: getting the remaining hosts for this loop
7554 1726853170.40447: done getting the remaining hosts for this loop
7554 1726853170.40451: getting the next task for host managed_node3
7554 1726853170.40457: done getting next task for host managed_node3
7554 1726853170.40460: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
7554 1726853170.40463: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853170.40482: getting variables
7554 1726853170.40483: in VariableManager get_vars()
7554 1726853170.40530: Calling all_inventory to load vars for managed_node3
7554 1726853170.40532: Calling groups_inventory to load vars for managed_node3
7554 1726853170.40535: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853170.40544: Calling all_plugins_play to load vars for managed_node3
7554 1726853170.40547: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853170.40549: Calling groups_plugins_play to load vars for managed_node3
7554 1726853170.41505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853170.42359: done with get_vars()
7554 1726853170.42376: done getting variables
7554 1726853170.42419: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024  13:26:10 -0400 (0:00:00.091)       0:00:24.392 ******
7554 1726853170.42447: entering _queue_task() for managed_node3/package
7554 1726853170.42693: worker is 1 (out of 1 available)
7554 1726853170.42708: exiting _queue_task() for managed_node3/package
7554 1726853170.42720: done queuing things up, now waiting for results queue to drain
7554 1726853170.42722: waiting for pending results...
7554 1726853170.42914: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7554 1726853170.43001: in run() - task 02083763-bbaf-bdc3-98b6-000000000072 7554 1726853170.43013: variable 'ansible_search_path' from source: unknown 7554 1726853170.43017: variable 'ansible_search_path' from source: unknown 7554 1726853170.43048: calling self._execute() 7554 1726853170.43127: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853170.43133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853170.43146: variable 'omit' from source: magic vars 7554 1726853170.43429: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.43439: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853170.43524: variable 'network_state' from source: role '' defaults 7554 1726853170.43533: Evaluated conditional (network_state != {}): False 7554 1726853170.43536: when evaluation is False, skipping this task 7554 1726853170.43539: _execute() done 7554 1726853170.43544: dumping result to json 7554 1726853170.43546: done dumping result, returning 7554 1726853170.43553: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-bdc3-98b6-000000000072] 7554 1726853170.43560: sending task result for task 02083763-bbaf-bdc3-98b6-000000000072 7554 1726853170.43651: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000072 7554 1726853170.43654: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853170.43699: no more pending results, returning what we have 7554 1726853170.43702: results queue empty 7554 1726853170.43702: checking for any_errors_fatal 
7554 1726853170.43710: done checking for any_errors_fatal 7554 1726853170.43711: checking for max_fail_percentage 7554 1726853170.43712: done checking for max_fail_percentage 7554 1726853170.43714: checking to see if all hosts have failed and the running result is not ok 7554 1726853170.43715: done checking to see if all hosts have failed 7554 1726853170.43715: getting the remaining hosts for this loop 7554 1726853170.43717: done getting the remaining hosts for this loop 7554 1726853170.43720: getting the next task for host managed_node3 7554 1726853170.43726: done getting next task for host managed_node3 7554 1726853170.43730: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7554 1726853170.43733: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853170.43755: getting variables 7554 1726853170.43757: in VariableManager get_vars() 7554 1726853170.43801: Calling all_inventory to load vars for managed_node3 7554 1726853170.43803: Calling groups_inventory to load vars for managed_node3 7554 1726853170.43805: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853170.43814: Calling all_plugins_play to load vars for managed_node3 7554 1726853170.43817: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853170.43820: Calling groups_plugins_play to load vars for managed_node3 7554 1726853170.44593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853170.45459: done with get_vars() 7554 1726853170.45476: done getting variables 7554 1726853170.45520: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:26:10 -0400 (0:00:00.030) 0:00:24.423 ****** 7554 1726853170.45547: entering _queue_task() for managed_node3/package 7554 1726853170.45785: worker is 1 (out of 1 available) 7554 1726853170.45798: exiting _queue_task() for managed_node3/package 7554 1726853170.45809: done queuing things up, now waiting for results queue to drain 7554 1726853170.45811: waiting for pending results... 
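Both package-install tasks above are skipped for the same reason: `network_state` comes from the role defaults as an empty dict, so the `when:` conditional `network_state != {}` evaluates to False. A minimal Python sketch of that skip decision (the helper and variable values are illustrative stand-ins, not Ansible source):

```python
# Sketch of the skip decision shown in the log: the role default for
# network_state is an empty dict, so "network_state != {}" is False
# and TaskExecutor emits a skip result instead of running the module.
network_state = {}  # role default ("from source: role '' defaults")

def run_or_skip(condition, false_condition):
    """Illustrative stand-in for the conditional check in _execute()."""
    if not condition:
        return {
            "changed": False,
            "false_condition": false_condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

result = run_or_skip(network_state != {}, "network_state != {}")
```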
7554 1726853170.45999: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7554 1726853170.46088: in run() - task 02083763-bbaf-bdc3-98b6-000000000073 7554 1726853170.46100: variable 'ansible_search_path' from source: unknown 7554 1726853170.46104: variable 'ansible_search_path' from source: unknown 7554 1726853170.46132: calling self._execute() 7554 1726853170.46207: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853170.46212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853170.46223: variable 'omit' from source: magic vars 7554 1726853170.46493: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.46502: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853170.46588: variable 'network_state' from source: role '' defaults 7554 1726853170.46592: Evaluated conditional (network_state != {}): False 7554 1726853170.46595: when evaluation is False, skipping this task 7554 1726853170.46598: _execute() done 7554 1726853170.46600: dumping result to json 7554 1726853170.46603: done dumping result, returning 7554 1726853170.46612: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-bdc3-98b6-000000000073] 7554 1726853170.46617: sending task result for task 02083763-bbaf-bdc3-98b6-000000000073 7554 1726853170.46709: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000073 7554 1726853170.46713: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853170.46762: no more pending results, returning what we have 7554 1726853170.46765: results queue empty 7554 1726853170.46766: checking for any_errors_fatal 7554 
1726853170.46773: done checking for any_errors_fatal 7554 1726853170.46774: checking for max_fail_percentage 7554 1726853170.46775: done checking for max_fail_percentage 7554 1726853170.46777: checking to see if all hosts have failed and the running result is not ok 7554 1726853170.46778: done checking to see if all hosts have failed 7554 1726853170.46778: getting the remaining hosts for this loop 7554 1726853170.46780: done getting the remaining hosts for this loop 7554 1726853170.46783: getting the next task for host managed_node3 7554 1726853170.46789: done getting next task for host managed_node3 7554 1726853170.46792: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7554 1726853170.46795: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853170.46811: getting variables 7554 1726853170.46812: in VariableManager get_vars() 7554 1726853170.46855: Calling all_inventory to load vars for managed_node3 7554 1726853170.46858: Calling groups_inventory to load vars for managed_node3 7554 1726853170.46860: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853170.46868: Calling all_plugins_play to load vars for managed_node3 7554 1726853170.46877: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853170.46881: Calling groups_plugins_play to load vars for managed_node3 7554 1726853170.47715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853170.48566: done with get_vars() 7554 1726853170.48583: done getting variables 7554 1726853170.48627: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:26:10 -0400 (0:00:00.031) 0:00:24.454 ****** 7554 1726853170.48652: entering _queue_task() for managed_node3/service 7554 1726853170.48874: worker is 1 (out of 1 available) 7554 1726853170.48890: exiting _queue_task() for managed_node3/service 7554 1726853170.48901: done queuing things up, now waiting for results queue to drain 7554 1726853170.48903: waiting for pending results... 
7554 1726853170.49083: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7554 1726853170.49170: in run() - task 02083763-bbaf-bdc3-98b6-000000000074 7554 1726853170.49183: variable 'ansible_search_path' from source: unknown 7554 1726853170.49186: variable 'ansible_search_path' from source: unknown 7554 1726853170.49214: calling self._execute() 7554 1726853170.49292: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853170.49296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853170.49306: variable 'omit' from source: magic vars 7554 1726853170.49586: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.49595: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853170.49682: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853170.49811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853170.51477: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853170.51481: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853170.51501: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853170.51539: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853170.51575: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853170.51657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 
1726853170.51702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.51730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.51778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.51798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.51853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.51885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.51914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.51959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.51981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7554 1726853170.52025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.52059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.52084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.52116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.52136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.52264: variable 'network_connections' from source: task vars 7554 1726853170.52286: variable 'interface' from source: play vars 7554 1726853170.52330: variable 'interface' from source: play vars 7554 1726853170.52385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853170.52499: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853170.52538: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853170.52561: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853170.52583: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853170.52612: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853170.52632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853170.52652: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.52669: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853170.52709: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853170.52867: variable 'network_connections' from source: task vars 7554 1726853170.52873: variable 'interface' from source: play vars 7554 1726853170.52916: variable 'interface' from source: play vars 7554 1726853170.52934: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7554 1726853170.52937: when evaluation is False, skipping this task 7554 1726853170.52940: _execute() done 7554 1726853170.52952: dumping result to json 7554 1726853170.52955: done dumping result, returning 7554 1726853170.52958: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-000000000074] 7554 1726853170.52963: sending task result for task 02083763-bbaf-bdc3-98b6-000000000074 7554 1726853170.53049: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000074 7554 1726853170.53061: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7554 1726853170.53111: no more pending results, returning what we have 7554 1726853170.53114: results queue empty 7554 1726853170.53115: checking for any_errors_fatal 7554 1726853170.53119: done checking for any_errors_fatal 7554 1726853170.53120: checking for max_fail_percentage 7554 1726853170.53122: done checking for max_fail_percentage 7554 1726853170.53123: checking to see if all hosts have failed and the running result is not ok 7554 1726853170.53124: done checking to see if all hosts have failed 7554 1726853170.53124: getting the remaining hosts for this loop 7554 1726853170.53126: done getting the remaining hosts for this loop 7554 1726853170.53129: getting the next task for host managed_node3 7554 1726853170.53135: done getting next task for host managed_node3 7554 1726853170.53139: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7554 1726853170.53142: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853170.53159: getting variables 7554 1726853170.53161: in VariableManager get_vars() 7554 1726853170.53210: Calling all_inventory to load vars for managed_node3 7554 1726853170.53213: Calling groups_inventory to load vars for managed_node3 7554 1726853170.53215: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853170.53224: Calling all_plugins_play to load vars for managed_node3 7554 1726853170.53226: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853170.53228: Calling groups_plugins_play to load vars for managed_node3 7554 1726853170.54020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853170.55882: done with get_vars() 7554 1726853170.55908: done getting variables 7554 1726853170.55970: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:26:10 -0400 (0:00:00.073) 0:00:24.527 ****** 7554 1726853170.56003: entering _queue_task() for managed_node3/service 7554 1726853170.56393: worker is 1 (out of 1 available) 7554 1726853170.56406: exiting _queue_task() for managed_node3/service 7554 1726853170.56417: done queuing things up, now waiting for results queue to drain 7554 1726853170.56419: waiting for pending results... 
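The NetworkManager restart above is skipped because neither wireless nor team connections are defined. The role presumably derives those two flags by inspecting the interface types in `network_connections`; the actual Jinja expressions and play data are not visible in this log, so the sketch below is an assumption:

```python
# Hedged sketch: derive the two flags the skipped conditional combines.
# The play's real network_connections entry is not shown in the log;
# this ethernet-only example is hypothetical.
network_connections = [{"name": "veth0", "type": "ethernet"}]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

# Mirrors the log line: Evaluated conditional
# (__network_wireless_connections_defined or
#  __network_team_connections_defined): False
restart_needed = wireless_defined or team_defined
```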
7554 1726853170.56790: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7554 1726853170.56830: in run() - task 02083763-bbaf-bdc3-98b6-000000000075 7554 1726853170.56854: variable 'ansible_search_path' from source: unknown 7554 1726853170.56861: variable 'ansible_search_path' from source: unknown 7554 1726853170.56907: calling self._execute() 7554 1726853170.57017: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853170.57030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853170.57045: variable 'omit' from source: magic vars 7554 1726853170.57440: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.57460: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853170.57636: variable 'network_provider' from source: set_fact 7554 1726853170.57652: variable 'network_state' from source: role '' defaults 7554 1726853170.57750: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7554 1726853170.57753: variable 'omit' from source: magic vars 7554 1726853170.57756: variable 'omit' from source: magic vars 7554 1726853170.57776: variable 'network_service_name' from source: role '' defaults 7554 1726853170.57846: variable 'network_service_name' from source: role '' defaults 7554 1726853170.57966: variable '__network_provider_setup' from source: role '' defaults 7554 1726853170.57982: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853170.58040: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853170.58057: variable '__network_packages_default_nm' from source: role '' defaults 7554 1726853170.58124: variable '__network_packages_default_nm' from source: role '' defaults 7554 1726853170.58403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 
1726853170.60690: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853170.60976: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853170.60980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853170.60982: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853170.60983: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853170.60986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.60988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.61018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.61061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.61079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.61129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7554 1726853170.61156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.61185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.61230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.61249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.61483: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7554 1726853170.61603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.61632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.61674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.61716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.61734: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.61839: variable 'ansible_python' from source: facts 7554 1726853170.61879: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7554 1726853170.61977: variable '__network_wpa_supplicant_required' from source: role '' defaults 7554 1726853170.62078: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7554 1726853170.62205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.62236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.62295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.62320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.62340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.62393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853170.62523: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853170.62526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.62529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853170.62531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853170.62736: variable 'network_connections' from source: task vars 7554 1726853170.62740: variable 'interface' from source: play vars 7554 1726853170.62780: variable 'interface' from source: play vars 7554 1726853170.62903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853170.63124: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853170.63189: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853170.63236: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853170.63292: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853170.63357: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853170.63401: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853170.63496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853170.63500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853170.63525: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853170.63811: variable 'network_connections' from source: task vars 7554 1726853170.63830: variable 'interface' from source: play vars 7554 1726853170.63909: variable 'interface' from source: play vars 7554 1726853170.63958: variable '__network_packages_default_wireless' from source: role '' defaults 7554 1726853170.64045: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853170.64357: variable 'network_connections' from source: task vars 7554 1726853170.64374: variable 'interface' from source: play vars 7554 1726853170.64576: variable 'interface' from source: play vars 7554 1726853170.64580: variable '__network_packages_default_team' from source: role '' defaults 7554 1726853170.64582: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853170.64867: variable 'network_connections' from source: task vars 7554 1726853170.64880: variable 'interface' from source: play vars 7554 1726853170.64962: variable 'interface' from source: play vars 7554 1726853170.65023: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853170.65095: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853170.65106: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7554 1726853170.65178: variable '__network_packages_default_initscripts' from source: role '' defaults 7554 1726853170.65407: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7554 1726853170.65931: variable 'network_connections' from source: task vars 7554 1726853170.65944: variable 'interface' from source: play vars 7554 1726853170.66014: variable 'interface' from source: play vars 7554 1726853170.66025: variable 'ansible_distribution' from source: facts 7554 1726853170.66033: variable '__network_rh_distros' from source: role '' defaults 7554 1726853170.66046: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.66062: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7554 1726853170.66331: variable 'ansible_distribution' from source: facts 7554 1726853170.66334: variable '__network_rh_distros' from source: role '' defaults 7554 1726853170.66336: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.66338: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7554 1726853170.66466: variable 'ansible_distribution' from source: facts 7554 1726853170.66477: variable '__network_rh_distros' from source: role '' defaults 7554 1726853170.66487: variable 'ansible_distribution_major_version' from source: facts 7554 1726853170.66523: variable 'network_provider' from source: set_fact 7554 1726853170.66560: variable 'omit' from source: magic vars 7554 1726853170.66593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853170.66626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853170.66658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853170.66684: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853170.66698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853170.66729: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853170.66736: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853170.66746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853170.66879: Set connection var ansible_shell_executable to /bin/sh 7554 1726853170.66883: Set connection var ansible_pipelining to False 7554 1726853170.66885: Set connection var ansible_shell_type to sh 7554 1726853170.66887: Set connection var ansible_connection to ssh 7554 1726853170.66900: Set connection var ansible_timeout to 10 7554 1726853170.66910: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853170.66976: variable 'ansible_shell_executable' from source: unknown 7554 1726853170.66979: variable 'ansible_connection' from source: unknown 7554 1726853170.66990: variable 'ansible_module_compression' from source: unknown 7554 1726853170.66993: variable 'ansible_shell_type' from source: unknown 7554 1726853170.66995: variable 'ansible_shell_executable' from source: unknown 7554 1726853170.66997: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853170.66998: variable 'ansible_pipelining' from source: unknown 7554 1726853170.67000: variable 'ansible_timeout' from source: unknown 7554 1726853170.67002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853170.67097: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853170.67112: variable 'omit' from source: magic vars 7554 1726853170.67207: starting attempt loop 7554 1726853170.67210: running the handler 7554 1726853170.67219: variable 'ansible_facts' from source: unknown 7554 1726853170.68064: _low_level_execute_command(): starting 7554 1726853170.68086: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853170.68833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853170.68888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853170.68903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853170.68957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853170.69002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853170.69019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853170.69080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 7554 1726853170.69152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853170.70869: stdout chunk (state=3): >>>/root <<< 7554 1726853170.71086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853170.71090: stdout chunk (state=3): >>><<< 7554 1726853170.71092: stderr chunk (state=3): >>><<< 7554 1726853170.71095: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853170.71098: _low_level_execute_command(): starting 7554 1726853170.71100: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833 `" && echo ansible-tmp-1726853170.7103953-8487-166394534661833="` echo 
/root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833 `" ) && sleep 0' 7554 1726853170.71637: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853170.71644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853170.71653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853170.71668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853170.71687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853170.71694: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853170.71704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853170.71717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853170.71725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853170.71731: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7554 1726853170.71747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853170.71750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853170.71853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853170.71857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853170.71859: stderr chunk (state=3): >>>debug2: match found <<< 7554 1726853170.71861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853170.71863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 
1726853170.71865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853170.71899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853170.71965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853170.73992: stdout chunk (state=3): >>>ansible-tmp-1726853170.7103953-8487-166394534661833=/root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833 <<< 7554 1726853170.74277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853170.74281: stderr chunk (state=3): >>><<< 7554 1726853170.74284: stdout chunk (state=3): >>><<< 7554 1726853170.74287: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853170.7103953-8487-166394534661833=/root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 7554 1726853170.74289: variable 'ansible_module_compression' from source: unknown 7554 1726853170.74291: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7554 1726853170.74332: variable 'ansible_facts' from source: unknown 7554 1726853170.74525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833/AnsiballZ_systemd.py 7554 1726853170.74694: Sending initial data 7554 1726853170.74698: Sent initial data (154 bytes) 7554 1726853170.75299: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853170.75310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853170.75387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853170.75418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853170.75429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853170.75450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
7554 1726853170.75544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853170.77199: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7554 1726853170.77226: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853170.77291: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853170.77355: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpz8gydjhs /root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833/AnsiballZ_systemd.py <<< 7554 1726853170.77358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833/AnsiballZ_systemd.py" <<< 7554 1726853170.77417: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpz8gydjhs" to remote "/root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833/AnsiballZ_systemd.py" <<< 7554 1726853170.79183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853170.79187: stdout chunk (state=3): >>><<< 7554 1726853170.79189: stderr chunk (state=3): >>><<< 7554 1726853170.79191: done transferring module to remote 7554 1726853170.79194: _low_level_execute_command(): starting 7554 1726853170.79196: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833/ /root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833/AnsiballZ_systemd.py && sleep 0' 7554 1726853170.79802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853170.79810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853170.79821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853170.79835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853170.79851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853170.79859: 
stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853170.79868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853170.79884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853170.79945: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853170.79948: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7554 1726853170.79950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853170.79952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853170.79954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853170.79956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853170.79958: stderr chunk (state=3): >>>debug2: match found <<< 7554 1726853170.79959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853170.80054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853170.80057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853170.80288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853170.80366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853170.82419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853170.82422: stdout chunk (state=3): >>><<< 7554 1726853170.82425: stderr chunk (state=3): >>><<< 7554 1726853170.82428: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853170.82430: _low_level_execute_command(): starting 7554 1726853170.82433: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833/AnsiballZ_systemd.py && sleep 0' 7554 1726853170.83061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853170.83107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853170.83185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853171.12888: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9502720", "MemoryPeak": "10031104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3326943232", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "122045000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not 
set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", <<< 7554 1726853171.12896: stdout chunk (state=3): >>>"MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", 
"NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service multi-user.target cl<<< 7554 1726853171.12913: stdout chunk (state=3): >>>oud-init.service network.target", "After": "cloud-init-local.service network-pre.target dbus.socket system.slice sysinit.target dbus-broker.service basic.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:21:20 EDT", "StateChangeTimestampMonotonic": "24855925", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", 
"InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7554 1726853171.15078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853171.15082: stdout chunk (state=3): >>><<< 7554 1726853171.15085: stderr chunk (state=3): >>><<< 7554 1726853171.15088: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9502720", "MemoryPeak": "10031104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3326943232", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "122045000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service multi-user.target cloud-init.service network.target", "After": "cloud-init-local.service network-pre.target dbus.socket system.slice sysinit.target dbus-broker.service basic.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:21:20 EDT", "StateChangeTimestampMonotonic": "24855925", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": 
"Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853171.15158: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853171.15180: _low_level_execute_command(): starting 7554 1726853171.15194: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853170.7103953-8487-166394534661833/ > /dev/null 2>&1 && sleep 0' 7554 1726853171.15833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853171.15886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853171.15962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853171.15990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853171.16018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853171.16095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853171.17952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853171.17974: stderr chunk (state=3): >>><<< 7554 1726853171.17978: stdout chunk (state=3): >>><<< 7554 1726853171.17991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 
1726853171.17999: handler run complete 7554 1726853171.18041: attempt loop complete, returning result 7554 1726853171.18044: _execute() done 7554 1726853171.18047: dumping result to json 7554 1726853171.18060: done dumping result, returning 7554 1726853171.18068: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-bdc3-98b6-000000000075] 7554 1726853171.18075: sending task result for task 02083763-bbaf-bdc3-98b6-000000000075 7554 1726853171.18299: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000075 7554 1726853171.18301: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853171.18355: no more pending results, returning what we have 7554 1726853171.18358: results queue empty 7554 1726853171.18359: checking for any_errors_fatal 7554 1726853171.18365: done checking for any_errors_fatal 7554 1726853171.18366: checking for max_fail_percentage 7554 1726853171.18367: done checking for max_fail_percentage 7554 1726853171.18368: checking to see if all hosts have failed and the running result is not ok 7554 1726853171.18369: done checking to see if all hosts have failed 7554 1726853171.18369: getting the remaining hosts for this loop 7554 1726853171.18373: done getting the remaining hosts for this loop 7554 1726853171.18376: getting the next task for host managed_node3 7554 1726853171.18382: done getting next task for host managed_node3 7554 1726853171.18385: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7554 1726853171.18388: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853171.18398: getting variables 7554 1726853171.18400: in VariableManager get_vars() 7554 1726853171.18447: Calling all_inventory to load vars for managed_node3 7554 1726853171.18450: Calling groups_inventory to load vars for managed_node3 7554 1726853171.18452: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853171.18461: Calling all_plugins_play to load vars for managed_node3 7554 1726853171.18463: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853171.18466: Calling groups_plugins_play to load vars for managed_node3 7554 1726853171.19244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853171.20205: done with get_vars() 7554 1726853171.20224: done getting variables 7554 1726853171.20267: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:26:11 -0400 (0:00:00.642) 0:00:25.170 ****** 7554 1726853171.20295: entering _queue_task() for managed_node3/service 7554 1726853171.20533: worker is 1 (out of 1 available) 7554 1726853171.20545: exiting _queue_task() for managed_node3/service 7554 
1726853171.20557: done queuing things up, now waiting for results queue to drain 7554 1726853171.20559: waiting for pending results... 7554 1726853171.20743: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7554 1726853171.20837: in run() - task 02083763-bbaf-bdc3-98b6-000000000076 7554 1726853171.20853: variable 'ansible_search_path' from source: unknown 7554 1726853171.20857: variable 'ansible_search_path' from source: unknown 7554 1726853171.20885: calling self._execute() 7554 1726853171.20964: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853171.20968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853171.20979: variable 'omit' from source: magic vars 7554 1726853171.21476: variable 'ansible_distribution_major_version' from source: facts 7554 1726853171.21480: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853171.21483: variable 'network_provider' from source: set_fact 7554 1726853171.21486: Evaluated conditional (network_provider == "nm"): True 7554 1726853171.21567: variable '__network_wpa_supplicant_required' from source: role '' defaults 7554 1726853171.21660: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7554 1726853171.21834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853171.23823: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853171.23867: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853171.23902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853171.23923: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 
1726853171.23942: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853171.24010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853171.24027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853171.24044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853171.24073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853171.24084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853171.24118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853171.24134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853171.24153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853171.24179: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853171.24189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853171.24219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853171.24237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853171.24255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853171.24280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853171.24290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853171.24388: variable 'network_connections' from source: task vars 7554 1726853171.24399: variable 'interface' from source: play vars 7554 1726853171.24451: variable 'interface' from source: play vars 7554 1726853171.24502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853171.24624: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 
1726853171.24653: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853171.24678: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853171.24875: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853171.24878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853171.24880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853171.24882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853171.24884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853171.24886: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853171.25115: variable 'network_connections' from source: task vars 7554 1726853171.25125: variable 'interface' from source: play vars 7554 1726853171.25192: variable 'interface' from source: play vars 7554 1726853171.25225: Evaluated conditional (__network_wpa_supplicant_required): False 7554 1726853171.25233: when evaluation is False, skipping this task 7554 1726853171.25243: _execute() done 7554 1726853171.25251: dumping result to json 7554 1726853171.25258: done dumping result, returning 7554 1726853171.25269: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 
[02083763-bbaf-bdc3-98b6-000000000076] 7554 1726853171.25291: sending task result for task 02083763-bbaf-bdc3-98b6-000000000076 7554 1726853171.25399: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000076 7554 1726853171.25406: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7554 1726853171.25454: no more pending results, returning what we have 7554 1726853171.25458: results queue empty 7554 1726853171.25458: checking for any_errors_fatal 7554 1726853171.25679: done checking for any_errors_fatal 7554 1726853171.25680: checking for max_fail_percentage 7554 1726853171.25682: done checking for max_fail_percentage 7554 1726853171.25683: checking to see if all hosts have failed and the running result is not ok 7554 1726853171.25684: done checking to see if all hosts have failed 7554 1726853171.25684: getting the remaining hosts for this loop 7554 1726853171.25686: done getting the remaining hosts for this loop 7554 1726853171.25689: getting the next task for host managed_node3 7554 1726853171.25695: done getting next task for host managed_node3 7554 1726853171.25698: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7554 1726853171.25701: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853171.25716: getting variables 7554 1726853171.25718: in VariableManager get_vars() 7554 1726853171.25760: Calling all_inventory to load vars for managed_node3 7554 1726853171.25769: Calling groups_inventory to load vars for managed_node3 7554 1726853171.25775: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853171.25784: Calling all_plugins_play to load vars for managed_node3 7554 1726853171.25786: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853171.25789: Calling groups_plugins_play to load vars for managed_node3 7554 1726853171.27518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853171.29116: done with get_vars() 7554 1726853171.29146: done getting variables 7554 1726853171.29205: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:26:11 -0400 (0:00:00.089) 0:00:25.259 ****** 7554 1726853171.29241: entering _queue_task() for managed_node3/service 7554 1726853171.29784: worker is 1 (out of 1 available) 7554 1726853171.29794: exiting _queue_task() for managed_node3/service 7554 1726853171.29804: done queuing things up, now waiting for results queue to drain 7554 1726853171.29805: waiting for pending results... 
7554 1726853171.29899: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7554 1726853171.30142: in run() - task 02083763-bbaf-bdc3-98b6-000000000077 7554 1726853171.30146: variable 'ansible_search_path' from source: unknown 7554 1726853171.30149: variable 'ansible_search_path' from source: unknown 7554 1726853171.30151: calling self._execute() 7554 1726853171.30231: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853171.30248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853171.30263: variable 'omit' from source: magic vars 7554 1726853171.30644: variable 'ansible_distribution_major_version' from source: facts 7554 1726853171.30660: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853171.30767: variable 'network_provider' from source: set_fact 7554 1726853171.30779: Evaluated conditional (network_provider == "initscripts"): False 7554 1726853171.30792: when evaluation is False, skipping this task 7554 1726853171.30804: _execute() done 7554 1726853171.30812: dumping result to json 7554 1726853171.30820: done dumping result, returning 7554 1726853171.30831: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-bdc3-98b6-000000000077] 7554 1726853171.30842: sending task result for task 02083763-bbaf-bdc3-98b6-000000000077 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853171.31120: no more pending results, returning what we have 7554 1726853171.31125: results queue empty 7554 1726853171.31126: checking for any_errors_fatal 7554 1726853171.31136: done checking for any_errors_fatal 7554 1726853171.31137: checking for max_fail_percentage 7554 1726853171.31139: done checking for max_fail_percentage 7554 1726853171.31140: checking 
to see if all hosts have failed and the running result is not ok 7554 1726853171.31142: done checking to see if all hosts have failed 7554 1726853171.31142: getting the remaining hosts for this loop 7554 1726853171.31144: done getting the remaining hosts for this loop 7554 1726853171.31148: getting the next task for host managed_node3 7554 1726853171.31156: done getting next task for host managed_node3 7554 1726853171.31161: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7554 1726853171.31165: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853171.31192: getting variables 7554 1726853171.31194: in VariableManager get_vars() 7554 1726853171.31251: Calling all_inventory to load vars for managed_node3 7554 1726853171.31254: Calling groups_inventory to load vars for managed_node3 7554 1726853171.31257: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853171.31270: Calling all_plugins_play to load vars for managed_node3 7554 1726853171.31380: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853171.31390: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000077 7554 1726853171.31393: WORKER PROCESS EXITING 7554 1726853171.31398: Calling groups_plugins_play to load vars for managed_node3 7554 1726853171.32883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853171.34613: done with get_vars() 7554 1726853171.34640: done getting variables 7554 1726853171.34705: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:26:11 -0400 (0:00:00.055) 0:00:25.314 ****** 7554 1726853171.34740: entering _queue_task() for managed_node3/copy 7554 1726853171.35173: worker is 1 (out of 1 available) 7554 1726853171.35297: exiting _queue_task() for managed_node3/copy 7554 1726853171.35308: done queuing things up, now waiting for results queue to drain 7554 1726853171.35310: waiting for pending results... 
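Each debug line above is prefixed with the worker PID (7554) and a float epoch timestamp; the human-readable banner time the timing callback prints ("Friday 20 September 2024 13:26:11 -0400") is the same instant rendered in the controller's local zone. A minimal sketch of that conversion (the fixed -0400 offset is read off the banner, not queried from the system):

```python
from datetime import datetime, timedelta, timezone

# Epoch prefix taken from the _queue_task() line above; -0400 comes from the banner.
ts = 1726853171.34740
edt = timezone(timedelta(hours=-4))
banner = datetime.fromtimestamp(ts, tz=edt).strftime("%A %d %B %Y %H:%M:%S %z")
print(banner)  # Friday 20 September 2024 13:26:11 -0400
```

This makes it easy to correlate the raw `7554 1726853171.34740:` prefixes with the per-task banners when reading long runs.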
7554 1726853171.35496: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7554 1726853171.35654: in run() - task 02083763-bbaf-bdc3-98b6-000000000078 7554 1726853171.35729: variable 'ansible_search_path' from source: unknown 7554 1726853171.35732: variable 'ansible_search_path' from source: unknown 7554 1726853171.35735: calling self._execute() 7554 1726853171.35828: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853171.35846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853171.35867: variable 'omit' from source: magic vars 7554 1726853171.36249: variable 'ansible_distribution_major_version' from source: facts 7554 1726853171.36271: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853171.36394: variable 'network_provider' from source: set_fact 7554 1726853171.36410: Evaluated conditional (network_provider == "initscripts"): False 7554 1726853171.36487: when evaluation is False, skipping this task 7554 1726853171.36490: _execute() done 7554 1726853171.36493: dumping result to json 7554 1726853171.36495: done dumping result, returning 7554 1726853171.36498: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-bdc3-98b6-000000000078] 7554 1726853171.36500: sending task result for task 02083763-bbaf-bdc3-98b6-000000000078 7554 1726853171.36573: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000078 7554 1726853171.36577: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7554 1726853171.36630: no more pending results, returning what we have 7554 1726853171.36634: results queue empty 7554 1726853171.36635: checking for any_errors_fatal 7554 
1726853171.36640: done checking for any_errors_fatal 7554 1726853171.36641: checking for max_fail_percentage 7554 1726853171.36643: done checking for max_fail_percentage 7554 1726853171.36644: checking to see if all hosts have failed and the running result is not ok 7554 1726853171.36645: done checking to see if all hosts have failed 7554 1726853171.36645: getting the remaining hosts for this loop 7554 1726853171.36647: done getting the remaining hosts for this loop 7554 1726853171.36651: getting the next task for host managed_node3 7554 1726853171.36657: done getting next task for host managed_node3 7554 1726853171.36661: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7554 1726853171.36665: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853171.36691: getting variables 7554 1726853171.36693: in VariableManager get_vars() 7554 1726853171.36748: Calling all_inventory to load vars for managed_node3 7554 1726853171.36751: Calling groups_inventory to load vars for managed_node3 7554 1726853171.36753: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853171.36766: Calling all_plugins_play to load vars for managed_node3 7554 1726853171.36770: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853171.36988: Calling groups_plugins_play to load vars for managed_node3 7554 1726853171.38875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853171.39828: done with get_vars() 7554 1726853171.39847: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:26:11 -0400 (0:00:00.051) 0:00:25.366 ****** 7554 1726853171.39911: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7554 1726853171.40158: worker is 1 (out of 1 available) 7554 1726853171.40175: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7554 1726853171.40186: done queuing things up, now waiting for results queue to drain 7554 1726853171.40188: waiting for pending results... 
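Both tasks above were skipped the same way: the first `when` clause (`ansible_distribution_major_version != '6'`) evaluated True, but `network_provider == "initscripts"` evaluated False, so the executor short-circuited before running the module. A rough sketch of that gate, with plain Python lambdas standing in for Jinja2 condition rendering (the function and variable names here are illustrative, not Ansible internals):

```python
def should_run(conditions, task_vars):
    """Return False as soon as any `when` condition is falsy (task is skipped)."""
    for name, cond in conditions:
        result = cond(task_vars)
        print(f"Evaluated conditional ({name}): {result}")
        if not result:
            return False
    return True

# network_provider is "nm" here, per the set_fact seen earlier in the run.
task_vars = {"ansible_distribution_major_version": "10", "network_provider": "nm"}
conditions = [
    ("ansible_distribution_major_version != '6'",
     lambda v: v["ansible_distribution_major_version"] != "6"),
    ('network_provider == "initscripts"',
     lambda v: v["network_provider"] == "initscripts"),
]
print(should_run(conditions, task_vars))  # False -> the task is skipped
```

Note that conditions are evaluated in order and the first False wins, which is exactly the sequence the log records: the distribution check passes, the provider check fails, and `skip_reason: "Conditional result was False"` is returned.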
7554 1726853171.40381: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7554 1726853171.40464: in run() - task 02083763-bbaf-bdc3-98b6-000000000079 7554 1726853171.40479: variable 'ansible_search_path' from source: unknown 7554 1726853171.40482: variable 'ansible_search_path' from source: unknown 7554 1726853171.40509: calling self._execute() 7554 1726853171.40589: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853171.40625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853171.40628: variable 'omit' from source: magic vars 7554 1726853171.41087: variable 'ansible_distribution_major_version' from source: facts 7554 1726853171.41091: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853171.41093: variable 'omit' from source: magic vars 7554 1726853171.41096: variable 'omit' from source: magic vars 7554 1726853171.41478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853171.44092: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853171.44160: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853171.44202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853171.44243: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853171.44277: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853171.44478: variable 'network_provider' from source: set_fact 7554 1726853171.44500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853171.44544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853171.44573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853171.44616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853171.44635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853171.44712: variable 'omit' from source: magic vars 7554 1726853171.44835: variable 'omit' from source: magic vars 7554 1726853171.44943: variable 'network_connections' from source: task vars 7554 1726853171.44965: variable 'interface' from source: play vars 7554 1726853171.45029: variable 'interface' from source: play vars 7554 1726853171.45296: variable 'omit' from source: magic vars 7554 1726853171.45677: variable '__lsr_ansible_managed' from source: task vars 7554 1726853171.45680: variable '__lsr_ansible_managed' from source: task vars 7554 1726853171.46565: Loaded config def from plugin (lookup/template) 7554 1726853171.46579: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7554 1726853171.46614: File lookup term: get_ansible_managed.j2 7554 1726853171.46624: variable 'ansible_search_path' from source: unknown 7554 1726853171.46634: evaluation_path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7554 1726853171.46652: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7554 1726853171.46678: variable 'ansible_search_path' from source: unknown 7554 1726853171.53175: variable 'ansible_managed' from source: unknown 7554 1726853171.53300: variable 'omit' from source: magic vars 7554 1726853171.53333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853171.53366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853171.53392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853171.53415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853171.53431: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853171.53463: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853171.53474: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853171.53483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853171.53580: Set connection var ansible_shell_executable to /bin/sh 7554 1726853171.53594: Set connection var ansible_pipelining to False 7554 1726853171.53601: Set connection var ansible_shell_type to sh 7554 1726853171.53608: Set connection var ansible_connection to ssh 7554 1726853171.53623: Set connection var ansible_timeout to 10 7554 1726853171.53633: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853171.53659: variable 'ansible_shell_executable' from source: unknown 7554 1726853171.53668: variable 'ansible_connection' from source: unknown 7554 1726853171.53677: variable 'ansible_module_compression' from source: unknown 7554 1726853171.53685: variable 'ansible_shell_type' from source: unknown 7554 1726853171.53775: variable 'ansible_shell_executable' from source: unknown 7554 1726853171.53778: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853171.53780: variable 'ansible_pipelining' from source: unknown 7554 1726853171.53782: variable 'ansible_timeout' from source: unknown 7554 1726853171.53785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853171.53847: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853171.53872: variable 'omit' from source: magic vars 7554 1726853171.53884: starting attempt loop 7554 1726853171.53892: running the handler 7554 
1726853171.53909: _low_level_execute_command(): starting 7554 1726853171.53920: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853171.54659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853171.54685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853171.54700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853171.54933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853171.56648: stdout chunk (state=3): >>>/root <<< 7554 1726853171.56820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853171.56846: stdout chunk (state=3): >>><<< 7554 1726853171.56850: stderr chunk (state=3): >>><<< 7554 1726853171.56868: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853171.56956: _low_level_execute_command(): starting 7554 1726853171.56960: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698 `" && echo ansible-tmp-1726853171.5687613-8519-277787660611698="` echo /root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698 `" ) && sleep 0' 7554 1726853171.57427: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853171.57441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853171.57467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853171.57482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853171.57486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853171.57534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853171.57537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853171.57547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853171.57618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853171.59637: stdout chunk (state=3): >>>ansible-tmp-1726853171.5687613-8519-277787660611698=/root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698 <<< 7554 1726853171.59793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853171.59798: stdout chunk (state=3): >>><<< 7554 1726853171.59800: stderr chunk (state=3): >>><<< 7554 1726853171.59960: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853171.5687613-8519-277787660611698=/root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853171.59964: variable 'ansible_module_compression' from source: unknown 7554 1726853171.59966: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7554 1726853171.59969: variable 'ansible_facts' from source: unknown 7554 1726853171.60103: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698/AnsiballZ_network_connections.py 7554 1726853171.60200: Sending initial data 7554 1726853171.60210: Sent initial data (166 bytes) 7554 1726853171.60616: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853171.60649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853171.60656: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853171.60658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853171.60661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853171.60705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853171.60709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853171.60776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853171.62467: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853171.62537: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853171.62602: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpl2tvxuma /root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698/AnsiballZ_network_connections.py <<< 7554 1726853171.62622: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698/AnsiballZ_network_connections.py" <<< 7554 1726853171.62679: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpl2tvxuma" to remote "/root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698/AnsiballZ_network_connections.py" <<< 7554 1726853171.63944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853171.63947: stderr chunk (state=3): >>><<< 7554 1726853171.63949: stdout chunk (state=3): >>><<< 7554 1726853171.63950: done transferring module to remote 7554 1726853171.63962: _low_level_execute_command(): starting 7554 1726853171.63968: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698/ /root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698/AnsiballZ_network_connections.py && sleep 0' 7554 1726853171.64364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853171.64395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853171.64398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853171.64444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853171.64449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853171.64509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853171.66392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853171.66414: stderr chunk (state=3): >>><<< 7554 1726853171.66417: stdout chunk (state=3): >>><<< 7554 1726853171.66433: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853171.66436: _low_level_execute_command(): starting 7554 1726853171.66443: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698/AnsiballZ_network_connections.py && sleep 0' 7554 1726853171.67090: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853171.67106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853171.67121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853171.67210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 7554 1726853172.02295: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mobw4bdt/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mobw4bdt/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/3c89e0d2-18d0-4f1d-897d-821d98d74a63: error=unknown <<< 7554 1726853172.02508: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7554 1726853172.04411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853172.04436: stderr chunk (state=3): >>><<< 7554 1726853172.04440: stdout chunk (state=3): >>><<< 7554 1726853172.04457: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mobw4bdt/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mobw4bdt/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/3c89e0d2-18d0-4f1d-897d-821d98d74a63: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853172.04487: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853172.04502: _low_level_execute_command(): starting 7554 1726853172.04506: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853171.5687613-8519-277787660611698/ > 
/dev/null 2>&1 && sleep 0' 7554 1726853172.04934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853172.04937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853172.04939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853172.04942: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853172.04944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853172.04999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853172.05004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853172.05007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853172.05063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853172.06984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853172.06988: stdout chunk (state=3): >>><<< 7554 1726853172.06990: stderr chunk (state=3): >>><<< 7554 1726853172.07177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853172.07181: handler run complete 7554 1726853172.07183: attempt loop complete, returning result 7554 1726853172.07185: _execute() done 7554 1726853172.07187: dumping result to json 7554 1726853172.07189: done dumping result, returning 7554 1726853172.07196: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-bdc3-98b6-000000000079] 7554 1726853172.07199: sending task result for task 02083763-bbaf-bdc3-98b6-000000000079 7554 1726853172.07270: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000079 changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, 
"ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 7554 1726853172.07383: no more pending results, returning what we have 7554 1726853172.07387: results queue empty 7554 1726853172.07388: checking for any_errors_fatal 7554 1726853172.07394: done checking for any_errors_fatal 7554 1726853172.07394: checking for max_fail_percentage 7554 1726853172.07396: done checking for max_fail_percentage 7554 1726853172.07397: checking to see if all hosts have failed and the running result is not ok 7554 1726853172.07398: done checking to see if all hosts have failed 7554 1726853172.07399: getting the remaining hosts for this loop 7554 1726853172.07400: done getting the remaining hosts for this loop 7554 1726853172.07404: getting the next task for host managed_node3 7554 1726853172.07410: done getting next task for host managed_node3 7554 1726853172.07413: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7554 1726853172.07416: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853172.07427: getting variables 7554 1726853172.07429: in VariableManager get_vars() 7554 1726853172.07596: Calling all_inventory to load vars for managed_node3 7554 1726853172.07599: Calling groups_inventory to load vars for managed_node3 7554 1726853172.07602: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853172.07608: WORKER PROCESS EXITING 7554 1726853172.07617: Calling all_plugins_play to load vars for managed_node3 7554 1726853172.07621: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853172.07623: Calling groups_plugins_play to load vars for managed_node3 7554 1726853172.09386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853172.11048: done with get_vars() 7554 1726853172.11072: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:26:12 -0400 (0:00:00.712) 0:00:26.079 ****** 7554 1726853172.11167: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7554 1726853172.11704: worker is 1 (out of 1 available) 7554 1726853172.11714: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7554 1726853172.11724: done queuing things up, now waiting for results queue to drain 7554 1726853172.11726: waiting for pending results... 
7554 1726853172.11834: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7554 1726853172.11967: in run() - task 02083763-bbaf-bdc3-98b6-00000000007a 7554 1726853172.11991: variable 'ansible_search_path' from source: unknown 7554 1726853172.11998: variable 'ansible_search_path' from source: unknown 7554 1726853172.12037: calling self._execute() 7554 1726853172.12150: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.12172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853172.12244: variable 'omit' from source: magic vars 7554 1726853172.12556: variable 'ansible_distribution_major_version' from source: facts 7554 1726853172.12576: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853172.12710: variable 'network_state' from source: role '' defaults 7554 1726853172.12730: Evaluated conditional (network_state != {}): False 7554 1726853172.12738: when evaluation is False, skipping this task 7554 1726853172.12747: _execute() done 7554 1726853172.12756: dumping result to json 7554 1726853172.12764: done dumping result, returning 7554 1726853172.12777: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-bdc3-98b6-00000000007a] 7554 1726853172.12790: sending task result for task 02083763-bbaf-bdc3-98b6-00000000007a 7554 1726853172.12908: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000007a 7554 1726853172.12910: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853172.12984: no more pending results, returning what we have 7554 1726853172.12988: results queue empty 7554 1726853172.12989: checking for any_errors_fatal 7554 1726853172.13001: done checking for any_errors_fatal 7554 1726853172.13002: 
checking for max_fail_percentage 7554 1726853172.13004: done checking for max_fail_percentage 7554 1726853172.13005: checking to see if all hosts have failed and the running result is not ok 7554 1726853172.13006: done checking to see if all hosts have failed 7554 1726853172.13007: getting the remaining hosts for this loop 7554 1726853172.13008: done getting the remaining hosts for this loop 7554 1726853172.13012: getting the next task for host managed_node3 7554 1726853172.13019: done getting next task for host managed_node3 7554 1726853172.13023: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7554 1726853172.13027: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853172.13053: getting variables 7554 1726853172.13055: in VariableManager get_vars() 7554 1726853172.13109: Calling all_inventory to load vars for managed_node3 7554 1726853172.13112: Calling groups_inventory to load vars for managed_node3 7554 1726853172.13115: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853172.13126: Calling all_plugins_play to load vars for managed_node3 7554 1726853172.13130: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853172.13133: Calling groups_plugins_play to load vars for managed_node3 7554 1726853172.14615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853172.16203: done with get_vars() 7554 1726853172.16225: done getting variables 7554 1726853172.16285: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:26:12 -0400 (0:00:00.051) 0:00:26.130 ****** 7554 1726853172.16319: entering _queue_task() for managed_node3/debug 7554 1726853172.16663: worker is 1 (out of 1 available) 7554 1726853172.16813: exiting _queue_task() for managed_node3/debug 7554 1726853172.16825: done queuing things up, now waiting for results queue to drain 7554 1726853172.16827: waiting for pending results... 
7554 1726853172.17481: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7554 1726853172.17527: in run() - task 02083763-bbaf-bdc3-98b6-00000000007b 7554 1726853172.17598: variable 'ansible_search_path' from source: unknown 7554 1726853172.17781: variable 'ansible_search_path' from source: unknown 7554 1726853172.17786: calling self._execute() 7554 1726853172.17941: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.17955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853172.17972: variable 'omit' from source: magic vars 7554 1726853172.18784: variable 'ansible_distribution_major_version' from source: facts 7554 1726853172.18881: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853172.18894: variable 'omit' from source: magic vars 7554 1726853172.18963: variable 'omit' from source: magic vars 7554 1726853172.19063: variable 'omit' from source: magic vars 7554 1726853172.19176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853172.19212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853172.19264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853172.19376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853172.19461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853172.19464: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853172.19466: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.19468: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7554 1726853172.19633: Set connection var ansible_shell_executable to /bin/sh 7554 1726853172.19692: Set connection var ansible_pipelining to False 7554 1726853172.19698: Set connection var ansible_shell_type to sh 7554 1726853172.19704: Set connection var ansible_connection to ssh 7554 1726853172.19715: Set connection var ansible_timeout to 10 7554 1726853172.19723: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853172.19746: variable 'ansible_shell_executable' from source: unknown 7554 1726853172.19788: variable 'ansible_connection' from source: unknown 7554 1726853172.19897: variable 'ansible_module_compression' from source: unknown 7554 1726853172.19900: variable 'ansible_shell_type' from source: unknown 7554 1726853172.19902: variable 'ansible_shell_executable' from source: unknown 7554 1726853172.19905: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.19907: variable 'ansible_pipelining' from source: unknown 7554 1726853172.19909: variable 'ansible_timeout' from source: unknown 7554 1726853172.19914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853172.20334: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853172.20338: variable 'omit' from source: magic vars 7554 1726853172.20341: starting attempt loop 7554 1726853172.20343: running the handler 7554 1726853172.20443: variable '__network_connections_result' from source: set_fact 7554 1726853172.20498: handler run complete 7554 1726853172.20526: attempt loop complete, returning result 7554 1726853172.20534: _execute() done 7554 1726853172.20542: dumping result to json 7554 1726853172.20560: done dumping result, returning 7554 
1726853172.20577: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-bdc3-98b6-00000000007b] 7554 1726853172.20589: sending task result for task 02083763-bbaf-bdc3-98b6-00000000007b ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 7554 1726853172.20863: no more pending results, returning what we have 7554 1726853172.20866: results queue empty 7554 1726853172.20867: checking for any_errors_fatal 7554 1726853172.20876: done checking for any_errors_fatal 7554 1726853172.20877: checking for max_fail_percentage 7554 1726853172.20878: done checking for max_fail_percentage 7554 1726853172.20880: checking to see if all hosts have failed and the running result is not ok 7554 1726853172.20881: done checking to see if all hosts have failed 7554 1726853172.20881: getting the remaining hosts for this loop 7554 1726853172.20883: done getting the remaining hosts for this loop 7554 1726853172.20887: getting the next task for host managed_node3 7554 1726853172.20894: done getting next task for host managed_node3 7554 1726853172.20898: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7554 1726853172.20901: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853172.20915: getting variables 7554 1726853172.20917: in VariableManager get_vars() 7554 1726853172.20968: Calling all_inventory to load vars for managed_node3 7554 1726853172.21087: Calling groups_inventory to load vars for managed_node3 7554 1726853172.21092: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853172.21101: Calling all_plugins_play to load vars for managed_node3 7554 1726853172.21105: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853172.21108: Calling groups_plugins_play to load vars for managed_node3 7554 1726853172.21707: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000007b 7554 1726853172.21711: WORKER PROCESS EXITING 7554 1726853172.22709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853172.25756: done with get_vars() 7554 1726853172.25903: done getting variables 7554 1726853172.25970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:26:12 -0400 (0:00:00.096) 0:00:26.227 ****** 7554 1726853172.26013: entering _queue_task() for managed_node3/debug 7554 1726853172.26346: worker is 1 (out of 1 available) 7554 1726853172.26359: exiting _queue_task() for managed_node3/debug 7554 1726853172.26412: done queuing things up, now waiting for results queue to drain 7554 1726853172.26414: waiting for pending results... 
7554 1726853172.26857: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7554 1726853172.27177: in run() - task 02083763-bbaf-bdc3-98b6-00000000007c 7554 1726853172.27195: variable 'ansible_search_path' from source: unknown 7554 1726853172.27199: variable 'ansible_search_path' from source: unknown 7554 1726853172.27229: calling self._execute() 7554 1726853172.27545: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.27554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853172.27565: variable 'omit' from source: magic vars 7554 1726853172.28257: variable 'ansible_distribution_major_version' from source: facts 7554 1726853172.28267: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853172.28274: variable 'omit' from source: magic vars 7554 1726853172.28433: variable 'omit' from source: magic vars 7554 1726853172.28476: variable 'omit' from source: magic vars 7554 1726853172.28518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853172.28558: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853172.28606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853172.28619: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853172.28631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853172.28678: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853172.28681: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.28689: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853172.28799: Set connection var ansible_shell_executable to /bin/sh 7554 1726853172.28807: Set connection var ansible_pipelining to False 7554 1726853172.28810: Set connection var ansible_shell_type to sh 7554 1726853172.28812: Set connection var ansible_connection to ssh 7554 1726853172.28822: Set connection var ansible_timeout to 10 7554 1726853172.28834: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853172.28851: variable 'ansible_shell_executable' from source: unknown 7554 1726853172.28854: variable 'ansible_connection' from source: unknown 7554 1726853172.28857: variable 'ansible_module_compression' from source: unknown 7554 1726853172.28859: variable 'ansible_shell_type' from source: unknown 7554 1726853172.28862: variable 'ansible_shell_executable' from source: unknown 7554 1726853172.28864: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.28942: variable 'ansible_pipelining' from source: unknown 7554 1726853172.28945: variable 'ansible_timeout' from source: unknown 7554 1726853172.28947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853172.29040: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853172.29055: variable 'omit' from source: magic vars 7554 1726853172.29063: starting attempt loop 7554 1726853172.29066: running the handler 7554 1726853172.29114: variable '__network_connections_result' from source: set_fact 7554 1726853172.29193: variable '__network_connections_result' from source: set_fact 7554 1726853172.29301: handler run complete 7554 1726853172.29351: attempt loop complete, returning result 7554 1726853172.29353: _execute() done 7554 1726853172.29356: dumping 
result to json 7554 1726853172.29358: done dumping result, returning 7554 1726853172.29360: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-bdc3-98b6-00000000007c] 7554 1726853172.29362: sending task result for task 02083763-bbaf-bdc3-98b6-00000000007c 7554 1726853172.29540: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000007c 7554 1726853172.29543: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 7554 1726853172.29616: no more pending results, returning what we have 7554 1726853172.29619: results queue empty 7554 1726853172.29619: checking for any_errors_fatal 7554 1726853172.29625: done checking for any_errors_fatal 7554 1726853172.29626: checking for max_fail_percentage 7554 1726853172.29627: done checking for max_fail_percentage 7554 1726853172.29628: checking to see if all hosts have failed and the running result is not ok 7554 1726853172.29629: done checking to see if all hosts have failed 7554 1726853172.29629: getting the remaining hosts for this loop 7554 1726853172.29631: done getting the remaining hosts for this loop 7554 1726853172.29633: getting the next task for host managed_node3 7554 1726853172.29638: done getting next task for host managed_node3 7554 1726853172.29642: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7554 1726853172.29644: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853172.29654: getting variables 7554 1726853172.29656: in VariableManager get_vars() 7554 1726853172.29696: Calling all_inventory to load vars for managed_node3 7554 1726853172.29699: Calling groups_inventory to load vars for managed_node3 7554 1726853172.29701: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853172.29710: Calling all_plugins_play to load vars for managed_node3 7554 1726853172.29712: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853172.29716: Calling groups_plugins_play to load vars for managed_node3 7554 1726853172.31174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853172.34807: done with get_vars() 7554 1726853172.34837: done getting variables 7554 1726853172.34899: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:26:12 -0400 (0:00:00.089) 0:00:26.316 ****** 7554 1726853172.34942: entering _queue_task() for managed_node3/debug 7554 1726853172.35261: worker is 1 (out of 1 available) 7554 1726853172.35478: exiting _queue_task() 
for managed_node3/debug 7554 1726853172.35488: done queuing things up, now waiting for results queue to drain 7554 1726853172.35490: waiting for pending results... 7554 1726853172.35689: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7554 1726853172.35787: in run() - task 02083763-bbaf-bdc3-98b6-00000000007d 7554 1726853172.35791: variable 'ansible_search_path' from source: unknown 7554 1726853172.35793: variable 'ansible_search_path' from source: unknown 7554 1726853172.35796: calling self._execute() 7554 1726853172.35877: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.35895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853172.35910: variable 'omit' from source: magic vars 7554 1726853172.36273: variable 'ansible_distribution_major_version' from source: facts 7554 1726853172.36290: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853172.36415: variable 'network_state' from source: role '' defaults 7554 1726853172.36433: Evaluated conditional (network_state != {}): False 7554 1726853172.36448: when evaluation is False, skipping this task 7554 1726853172.36456: _execute() done 7554 1726853172.36463: dumping result to json 7554 1726853172.36550: done dumping result, returning 7554 1726853172.36554: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-bdc3-98b6-00000000007d] 7554 1726853172.36557: sending task result for task 02083763-bbaf-bdc3-98b6-00000000007d 7554 1726853172.36621: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000007d 7554 1726853172.36624: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7554 1726853172.36700: no more pending results, returning what we have 7554 1726853172.36704: results queue empty 7554 
1726853172.36705: checking for any_errors_fatal 7554 1726853172.36713: done checking for any_errors_fatal 7554 1726853172.36713: checking for max_fail_percentage 7554 1726853172.36715: done checking for max_fail_percentage 7554 1726853172.36716: checking to see if all hosts have failed and the running result is not ok 7554 1726853172.36717: done checking to see if all hosts have failed 7554 1726853172.36718: getting the remaining hosts for this loop 7554 1726853172.36720: done getting the remaining hosts for this loop 7554 1726853172.36723: getting the next task for host managed_node3 7554 1726853172.36729: done getting next task for host managed_node3 7554 1726853172.36733: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7554 1726853172.36736: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853172.36759: getting variables 7554 1726853172.36761: in VariableManager get_vars() 7554 1726853172.36812: Calling all_inventory to load vars for managed_node3 7554 1726853172.36815: Calling groups_inventory to load vars for managed_node3 7554 1726853172.36817: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853172.36829: Calling all_plugins_play to load vars for managed_node3 7554 1726853172.36832: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853172.36835: Calling groups_plugins_play to load vars for managed_node3 7554 1726853172.38434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853172.40956: done with get_vars() 7554 1726853172.41089: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:26:12 -0400 (0:00:00.062) 0:00:26.379 ****** 7554 1726853172.41195: entering _queue_task() for managed_node3/ping 7554 1726853172.41946: worker is 1 (out of 1 available) 7554 1726853172.41960: exiting _queue_task() for managed_node3/ping 7554 1726853172.41974: done queuing things up, now waiting for results queue to drain 7554 1726853172.41976: waiting for pending results... 
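(Editor's note.) In the trace above, the "Show debug messages for the network_state" task is skipped because its conditional `network_state != {}` evaluates to False: `network_state` comes from the role's defaults as an empty dict. A minimal sketch of that skip decision follows; real Ansible renders the `when:` expression through Jinja2 templating, and `should_run`/`task_vars` here are hypothetical stand-ins for that machinery, not actual Ansible APIs.

```python
# Sketch of the skip decision seen in the trace: Ansible templates a
# task's `when:` expression against the host's variables and skips the
# task when the result is falsey. Jinja2 rendering is simplified to
# plain Python evaluation for illustration only.

def should_run(conditional: str, task_vars: dict) -> bool:
    """Evaluate a conditional expression against task variables.

    The real engine uses Jinja2; eval() is a simplification here.
    """
    return bool(eval(conditional, {}, task_vars))

# network_state defaults to {} in the role, so the task is skipped:
print(should_run("network_state != {}", {"network_state": {}}))   # -> False

# With a non-empty network_state the debug task would run:
print(should_run("network_state != {}", {"network_state": {"interfaces": []}}))  # -> True
```

This matches the log lines `Evaluated conditional (network_state != {}): False` followed by `when evaluation is False, skipping this task`.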
7554 1726853172.42579: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7554 1726853172.42748: in run() - task 02083763-bbaf-bdc3-98b6-00000000007e 7554 1726853172.42761: variable 'ansible_search_path' from source: unknown 7554 1726853172.42765: variable 'ansible_search_path' from source: unknown 7554 1726853172.42805: calling self._execute() 7554 1726853172.43012: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.43132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853172.43135: variable 'omit' from source: magic vars 7554 1726853172.43872: variable 'ansible_distribution_major_version' from source: facts 7554 1726853172.43916: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853172.43922: variable 'omit' from source: magic vars 7554 1726853172.44126: variable 'omit' from source: magic vars 7554 1726853172.44166: variable 'omit' from source: magic vars 7554 1726853172.44206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853172.44357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853172.44384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853172.44399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853172.44410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853172.44440: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853172.44682: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.44685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 
1726853172.44788: Set connection var ansible_shell_executable to /bin/sh 7554 1726853172.44797: Set connection var ansible_pipelining to False 7554 1726853172.44800: Set connection var ansible_shell_type to sh 7554 1726853172.44802: Set connection var ansible_connection to ssh 7554 1726853172.44812: Set connection var ansible_timeout to 10 7554 1726853172.44817: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853172.44840: variable 'ansible_shell_executable' from source: unknown 7554 1726853172.44846: variable 'ansible_connection' from source: unknown 7554 1726853172.44849: variable 'ansible_module_compression' from source: unknown 7554 1726853172.44851: variable 'ansible_shell_type' from source: unknown 7554 1726853172.44853: variable 'ansible_shell_executable' from source: unknown 7554 1726853172.44855: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853172.44857: variable 'ansible_pipelining' from source: unknown 7554 1726853172.44859: variable 'ansible_timeout' from source: unknown 7554 1726853172.44862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853172.45265: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853172.45277: variable 'omit' from source: magic vars 7554 1726853172.45283: starting attempt loop 7554 1726853172.45286: running the handler 7554 1726853172.45310: _low_level_execute_command(): starting 7554 1726853172.45376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853172.47229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853172.47248: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853172.47362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853172.47396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853172.47586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853172.47780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853172.47786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853172.49401: stdout chunk (state=3): >>>/root <<< 7554 1726853172.49520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853172.49545: stderr chunk (state=3): >>><<< 7554 1726853172.49548: stdout chunk (state=3): >>><<< 7554 1726853172.49574: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853172.49587: _low_level_execute_command(): starting 7554 1726853172.49594: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928 `" && echo ansible-tmp-1726853172.4957404-8556-193294778881928="` echo /root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928 `" ) && sleep 0' 7554 1726853172.50892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853172.51097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853172.51188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853172.53185: stdout chunk (state=3): >>>ansible-tmp-1726853172.4957404-8556-193294778881928=/root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928 <<< 7554 1726853172.53324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853172.53327: stdout chunk (state=3): >>><<< 7554 1726853172.53334: stderr chunk (state=3): >>><<< 7554 1726853172.53351: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853172.4957404-8556-193294778881928=/root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853172.53398: variable 'ansible_module_compression' from source: unknown 7554 1726853172.53436: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7554 1726853172.53470: variable 'ansible_facts' from source: unknown 7554 1726853172.53558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928/AnsiballZ_ping.py 7554 1726853172.53927: Sending initial data 7554 1726853172.53930: Sent initial data (151 bytes) 7554 1726853172.55299: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853172.55304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853172.55464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 
1726853172.55468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853172.55567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853172.55588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853172.57277: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853172.57306: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853172.57363: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpuq8d0waw /root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928/AnsiballZ_ping.py <<< 7554 1726853172.57367: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928/AnsiballZ_ping.py" <<< 7554 1726853172.57442: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpuq8d0waw" to remote "/root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928/AnsiballZ_ping.py" <<< 7554 1726853172.58743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853172.58848: stderr chunk (state=3): >>><<< 7554 1726853172.58857: stdout chunk (state=3): >>><<< 7554 1726853172.59057: done transferring module to remote 7554 1726853172.59060: _low_level_execute_command(): starting 7554 1726853172.59080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928/ /root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928/AnsiballZ_ping.py && sleep 0' 7554 1726853172.60391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853172.60496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853172.60514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853172.60532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853172.60623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853172.62561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853172.62565: stdout chunk (state=3): >>><<< 7554 1726853172.62573: stderr chunk (state=3): >>><<< 7554 1726853172.62592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853172.62599: _low_level_execute_command(): starting 7554 1726853172.62602: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928/AnsiballZ_ping.py && sleep 0' 7554 1726853172.63814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853172.63825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853172.63836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853172.63854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853172.63868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853172.63877: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853172.64169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853172.64209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 7554 1726853172.64277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853172.79605: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7554 1726853172.81380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853172.81383: stdout chunk (state=3): >>><<< 7554 1726853172.81386: stderr chunk (state=3): >>><<< 7554 1726853172.81388: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853172.81391: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853172.81393: _low_level_execute_command(): starting 7554 1726853172.81395: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853172.4957404-8556-193294778881928/ > /dev/null 2>&1 && sleep 0' 7554 1726853172.82498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853172.82506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853172.82517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853172.82532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853172.82548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853172.82556: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853172.82566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853172.82583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853172.82594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853172.82602: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7554 1726853172.82605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853172.82677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853172.82680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853172.82682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853172.82684: stderr chunk (state=3): >>>debug2: match found <<< 7554 1726853172.82686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853172.82907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853172.82965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853172.85082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853172.85087: stderr chunk (state=3): >>><<< 7554 1726853172.85089: stdout chunk (state=3): >>><<< 7554 1726853172.85092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853172.85098: handler run complete 7554 1726853172.85101: attempt loop complete, returning result 7554 1726853172.85103: _execute() done 7554 1726853172.85105: dumping result to json 7554 1726853172.85107: done dumping result, returning 7554 1726853172.85110: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-bdc3-98b6-00000000007e] 7554 1726853172.85112: sending task result for task 02083763-bbaf-bdc3-98b6-00000000007e 7554 1726853172.85177: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000007e 7554 1726853172.85179: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7554 1726853172.85243: no more pending results, returning what we have 7554 1726853172.85247: results queue empty 7554 1726853172.85248: checking for any_errors_fatal 7554 1726853172.85252: done checking for any_errors_fatal 7554 1726853172.85253: checking for max_fail_percentage 7554 1726853172.85254: done checking for max_fail_percentage 7554 1726853172.85255: checking to see if all hosts have failed and the running result is not ok 7554 1726853172.85256: done checking to see if all hosts have failed 7554 1726853172.85257: getting the remaining hosts for this loop 7554 1726853172.85259: done getting the remaining hosts for this loop 7554 1726853172.85262: getting the next task for host managed_node3 7554 1726853172.85272: done getting next task for 
host managed_node3 7554 1726853172.85275: ^ task is: TASK: meta (role_complete) 7554 1726853172.85278: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853172.85289: getting variables 7554 1726853172.85291: in VariableManager get_vars() 7554 1726853172.85335: Calling all_inventory to load vars for managed_node3 7554 1726853172.85337: Calling groups_inventory to load vars for managed_node3 7554 1726853172.85339: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853172.85348: Calling all_plugins_play to load vars for managed_node3 7554 1726853172.85351: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853172.85354: Calling groups_plugins_play to load vars for managed_node3 7554 1726853172.88391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.02535: done with get_vars() 7554 1726853173.02566: done getting variables 7554 1726853173.02707: done queuing things up, now waiting for results queue to drain 7554 1726853173.02709: results queue empty 7554 1726853173.02710: checking for any_errors_fatal 7554 1726853173.02713: done checking for any_errors_fatal 7554 1726853173.02714: checking for max_fail_percentage 7554 1726853173.02715: done checking for max_fail_percentage 7554 1726853173.02716: checking to see if all hosts have failed and the running result is not ok 7554 1726853173.02717: done checking to see if all hosts have 
failed 7554 1726853173.02717: getting the remaining hosts for this loop 7554 1726853173.02718: done getting the remaining hosts for this loop 7554 1726853173.02722: getting the next task for host managed_node3 7554 1726853173.02726: done getting next task for host managed_node3 7554 1726853173.02728: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7554 1726853173.02729: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853173.02732: getting variables 7554 1726853173.02733: in VariableManager get_vars() 7554 1726853173.02866: Calling all_inventory to load vars for managed_node3 7554 1726853173.02869: Calling groups_inventory to load vars for managed_node3 7554 1726853173.02873: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.02878: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.02881: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.02884: Calling groups_plugins_play to load vars for managed_node3 7554 1726853173.05130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.07995: done with get_vars() 7554 1726853173.08017: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:79 Friday 20 September 2024 13:26:13 -0400 (0:00:00.668) 0:00:27.048 ****** 7554 1726853173.08088: entering _queue_task() for managed_node3/include_tasks 7554 1726853173.08450: worker is 1 (out of 1 available) 7554 1726853173.08462: exiting _queue_task() for managed_node3/include_tasks 7554 
1726853173.08578: done queuing things up, now waiting for results queue to drain 7554 1726853173.08581: waiting for pending results... 7554 1726853173.08776: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7554 1726853173.08897: in run() - task 02083763-bbaf-bdc3-98b6-0000000000ae 7554 1726853173.08921: variable 'ansible_search_path' from source: unknown 7554 1726853173.08967: calling self._execute() 7554 1726853173.09154: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.09259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.09372: variable 'omit' from source: magic vars 7554 1726853173.09960: variable 'ansible_distribution_major_version' from source: facts 7554 1726853173.10031: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853173.10233: _execute() done 7554 1726853173.10237: dumping result to json 7554 1726853173.10240: done dumping result, returning 7554 1726853173.10243: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [02083763-bbaf-bdc3-98b6-0000000000ae] 7554 1726853173.10245: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000ae 7554 1726853173.10323: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000ae 7554 1726853173.10326: WORKER PROCESS EXITING 7554 1726853173.10364: no more pending results, returning what we have 7554 1726853173.10369: in VariableManager get_vars() 7554 1726853173.10429: Calling all_inventory to load vars for managed_node3 7554 1726853173.10432: Calling groups_inventory to load vars for managed_node3 7554 1726853173.10435: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.10562: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.10566: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.10570: Calling groups_plugins_play to load 
vars for managed_node3 7554 1726853173.12332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.13899: done with get_vars() 7554 1726853173.13922: variable 'ansible_search_path' from source: unknown 7554 1726853173.13937: we have included files to process 7554 1726853173.13938: generating all_blocks data 7554 1726853173.13940: done generating all_blocks data 7554 1726853173.13945: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853173.13947: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853173.13950: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853173.14349: in VariableManager get_vars() 7554 1726853173.14385: done with get_vars() 7554 1726853173.15028: done processing included file 7554 1726853173.15031: iterating over new_blocks loaded from include file 7554 1726853173.15032: in VariableManager get_vars() 7554 1726853173.15054: done with get_vars() 7554 1726853173.15056: filtering new block on tags 7554 1726853173.15092: done filtering new block on tags 7554 1726853173.15094: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7554 1726853173.15101: extending task lists for all hosts with included blocks 7554 1726853173.21825: done extending task lists 7554 1726853173.21828: done processing included files 7554 1726853173.21829: results queue empty 7554 1726853173.21849: checking for any_errors_fatal 7554 1726853173.21851: done checking for any_errors_fatal 7554 1726853173.21852: checking for max_fail_percentage 7554 
1726853173.21853: done checking for max_fail_percentage 7554 1726853173.21854: checking to see if all hosts have failed and the running result is not ok 7554 1726853173.21855: done checking to see if all hosts have failed 7554 1726853173.21856: getting the remaining hosts for this loop 7554 1726853173.21857: done getting the remaining hosts for this loop 7554 1726853173.21860: getting the next task for host managed_node3 7554 1726853173.21864: done getting next task for host managed_node3 7554 1726853173.21867: ^ task is: TASK: Ensure state in ["present", "absent"] 7554 1726853173.21869: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853173.21873: getting variables 7554 1726853173.21874: in VariableManager get_vars() 7554 1726853173.21936: Calling all_inventory to load vars for managed_node3 7554 1726853173.21938: Calling groups_inventory to load vars for managed_node3 7554 1726853173.21943: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.21950: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.21952: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.21956: Calling groups_plugins_play to load vars for managed_node3 7554 1726853173.23219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.25886: done with get_vars() 7554 1726853173.25909: done getting variables 7554 1726853173.25958: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:26:13 -0400 (0:00:00.178) 0:00:27.227 ****** 7554 1726853173.25989: entering _queue_task() for managed_node3/fail 7554 1726853173.26639: worker is 1 (out of 1 available) 7554 1726853173.26654: exiting _queue_task() for managed_node3/fail 7554 1726853173.26666: done queuing things up, now waiting for results queue to drain 7554 1726853173.26668: waiting for pending results... 
7554 1726853173.27127: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7554 1726853173.27381: in run() - task 02083763-bbaf-bdc3-98b6-000000000dff 7554 1726853173.27404: variable 'ansible_search_path' from source: unknown 7554 1726853173.27414: variable 'ansible_search_path' from source: unknown 7554 1726853173.27458: calling self._execute() 7554 1726853173.27723: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.27978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.27982: variable 'omit' from source: magic vars 7554 1726853173.29177: variable 'ansible_distribution_major_version' from source: facts 7554 1726853173.29182: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853173.29762: variable 'state' from source: include params 7554 1726853173.29765: Evaluated conditional (state not in ["present", "absent"]): False 7554 1726853173.29768: when evaluation is False, skipping this task 7554 1726853173.30021: _execute() done 7554 1726853173.30025: dumping result to json 7554 1726853173.30027: done dumping result, returning 7554 1726853173.30030: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-bdc3-98b6-000000000dff] 7554 1726853173.30033: sending task result for task 02083763-bbaf-bdc3-98b6-000000000dff 7554 1726853173.30112: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000dff 7554 1726853173.30116: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7554 1726853173.30175: no more pending results, returning what we have 7554 1726853173.30179: results queue empty 7554 1726853173.30181: checking for any_errors_fatal 7554 1726853173.30183: done checking for any_errors_fatal 7554 1726853173.30184: checking for 
max_fail_percentage 7554 1726853173.30185: done checking for max_fail_percentage 7554 1726853173.30186: checking to see if all hosts have failed and the running result is not ok 7554 1726853173.30188: done checking to see if all hosts have failed 7554 1726853173.30188: getting the remaining hosts for this loop 7554 1726853173.30190: done getting the remaining hosts for this loop 7554 1726853173.30195: getting the next task for host managed_node3 7554 1726853173.30201: done getting next task for host managed_node3 7554 1726853173.30204: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7554 1726853173.30208: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853173.30214: getting variables 7554 1726853173.30216: in VariableManager get_vars() 7554 1726853173.30482: Calling all_inventory to load vars for managed_node3 7554 1726853173.30485: Calling groups_inventory to load vars for managed_node3 7554 1726853173.30488: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.30500: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.30504: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.30507: Calling groups_plugins_play to load vars for managed_node3 7554 1726853173.32434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.34025: done with get_vars() 7554 1726853173.34048: done getting variables 7554 1726853173.34109: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:26:13 -0400 (0:00:00.081) 0:00:27.308 ****** 7554 1726853173.34136: entering _queue_task() for managed_node3/fail 7554 1726853173.34689: worker is 1 (out of 1 available) 7554 1726853173.34698: exiting _queue_task() for managed_node3/fail 7554 1726853173.34711: done queuing things up, now waiting for results queue to drain 7554 1726853173.34712: waiting for pending results... 
7554 1726853173.34952: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7554 1726853173.34957: in run() - task 02083763-bbaf-bdc3-98b6-000000000e00 7554 1726853173.34978: variable 'ansible_search_path' from source: unknown 7554 1726853173.34987: variable 'ansible_search_path' from source: unknown 7554 1726853173.35028: calling self._execute() 7554 1726853173.35173: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.35187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.35210: variable 'omit' from source: magic vars 7554 1726853173.35649: variable 'ansible_distribution_major_version' from source: facts 7554 1726853173.35667: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853173.35838: variable 'type' from source: play vars 7554 1726853173.35878: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7554 1726853173.35882: when evaluation is False, skipping this task 7554 1726853173.35885: _execute() done 7554 1726853173.35888: dumping result to json 7554 1726853173.35890: done dumping result, returning 7554 1726853173.35916: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-bdc3-98b6-000000000e00] 7554 1726853173.35919: sending task result for task 02083763-bbaf-bdc3-98b6-000000000e00 skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7554 1726853173.36174: no more pending results, returning what we have 7554 1726853173.36178: results queue empty 7554 1726853173.36179: checking for any_errors_fatal 7554 1726853173.36185: done checking for any_errors_fatal 7554 1726853173.36186: checking for max_fail_percentage 7554 1726853173.36187: done checking for max_fail_percentage 7554 1726853173.36188: checking to see if all hosts have failed and the 
running result is not ok 7554 1726853173.36189: done checking to see if all hosts have failed 7554 1726853173.36190: getting the remaining hosts for this loop 7554 1726853173.36192: done getting the remaining hosts for this loop 7554 1726853173.36196: getting the next task for host managed_node3 7554 1726853173.36203: done getting next task for host managed_node3 7554 1726853173.36206: ^ task is: TASK: Include the task 'show_interfaces.yml' 7554 1726853173.36210: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853173.36214: getting variables 7554 1726853173.36216: in VariableManager get_vars() 7554 1726853173.36335: Calling all_inventory to load vars for managed_node3 7554 1726853173.36338: Calling groups_inventory to load vars for managed_node3 7554 1726853173.36454: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.36542: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.36545: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.36548: Calling groups_plugins_play to load vars for managed_node3 7554 1726853173.36562: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000e00 7554 1726853173.36566: WORKER PROCESS EXITING 7554 1726853173.38044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.39640: done with get_vars() 7554 1726853173.39661: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:26:13 -0400 (0:00:00.056) 0:00:27.365 ****** 7554 1726853173.39754: entering _queue_task() for managed_node3/include_tasks 7554 1726853173.40061: worker is 1 (out of 1 available) 7554 1726853173.40075: exiting _queue_task() for managed_node3/include_tasks 7554 1726853173.40088: done queuing things up, now waiting for results queue to drain 7554 1726853173.40090: waiting for pending results... 
7554 1726853173.40342: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7554 1726853173.40511: in run() - task 02083763-bbaf-bdc3-98b6-000000000e01 7554 1726853173.40577: variable 'ansible_search_path' from source: unknown 7554 1726853173.40581: variable 'ansible_search_path' from source: unknown 7554 1726853173.40597: calling self._execute() 7554 1726853173.40735: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.40748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.40762: variable 'omit' from source: magic vars 7554 1726853173.41276: variable 'ansible_distribution_major_version' from source: facts 7554 1726853173.41279: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853173.41282: _execute() done 7554 1726853173.41285: dumping result to json 7554 1726853173.41287: done dumping result, returning 7554 1726853173.41289: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-bdc3-98b6-000000000e01] 7554 1726853173.41296: sending task result for task 02083763-bbaf-bdc3-98b6-000000000e01 7554 1726853173.41562: no more pending results, returning what we have 7554 1726853173.41568: in VariableManager get_vars() 7554 1726853173.41630: Calling all_inventory to load vars for managed_node3 7554 1726853173.41633: Calling groups_inventory to load vars for managed_node3 7554 1726853173.41636: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.41650: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.41653: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.41657: Calling groups_plugins_play to load vars for managed_node3 7554 1726853173.42295: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000e01 7554 1726853173.42299: WORKER PROCESS EXITING 7554 1726853173.43221: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.44954: done with get_vars() 7554 1726853173.44976: variable 'ansible_search_path' from source: unknown 7554 1726853173.44978: variable 'ansible_search_path' from source: unknown 7554 1726853173.45014: we have included files to process 7554 1726853173.45015: generating all_blocks data 7554 1726853173.45024: done generating all_blocks data 7554 1726853173.45030: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853173.45037: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853173.45041: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853173.45170: in VariableManager get_vars() 7554 1726853173.45216: done with get_vars() 7554 1726853173.45378: done processing included file 7554 1726853173.45387: iterating over new_blocks loaded from include file 7554 1726853173.45389: in VariableManager get_vars() 7554 1726853173.45421: done with get_vars() 7554 1726853173.45423: filtering new block on tags 7554 1726853173.45440: done filtering new block on tags 7554 1726853173.45443: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7554 1726853173.45448: extending task lists for all hosts with included blocks 7554 1726853173.45953: done extending task lists 7554 1726853173.45954: done processing included files 7554 1726853173.45955: results queue empty 7554 1726853173.45956: checking for any_errors_fatal 7554 1726853173.45959: done checking for any_errors_fatal 7554 1726853173.45960: checking for max_fail_percentage 7554 
1726853173.45961: done checking for max_fail_percentage 7554 1726853173.45962: checking to see if all hosts have failed and the running result is not ok 7554 1726853173.45963: done checking to see if all hosts have failed 7554 1726853173.45964: getting the remaining hosts for this loop 7554 1726853173.45965: done getting the remaining hosts for this loop 7554 1726853173.45967: getting the next task for host managed_node3 7554 1726853173.45973: done getting next task for host managed_node3 7554 1726853173.45975: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7554 1726853173.45984: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853173.45987: getting variables 7554 1726853173.45988: in VariableManager get_vars() 7554 1726853173.46009: Calling all_inventory to load vars for managed_node3 7554 1726853173.46012: Calling groups_inventory to load vars for managed_node3 7554 1726853173.46014: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.46031: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.46033: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.46043: Calling groups_plugins_play to load vars for managed_node3 7554 1726853173.47474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.49055: done with get_vars() 7554 1726853173.49076: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:26:13 -0400 (0:00:00.093) 0:00:27.459 ****** 7554 1726853173.49151: entering _queue_task() for managed_node3/include_tasks 7554 1726853173.49497: worker is 1 (out of 1 available) 7554 1726853173.49511: exiting _queue_task() for managed_node3/include_tasks 7554 1726853173.49523: done queuing things up, now waiting for results queue to drain 7554 1726853173.49524: waiting for pending results... 
7554 1726853173.49735: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7554 1726853173.49835: in run() - task 02083763-bbaf-bdc3-98b6-000000001030 7554 1726853173.49850: variable 'ansible_search_path' from source: unknown 7554 1726853173.49854: variable 'ansible_search_path' from source: unknown 7554 1726853173.49889: calling self._execute() 7554 1726853173.49990: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.49996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.50007: variable 'omit' from source: magic vars 7554 1726853173.50384: variable 'ansible_distribution_major_version' from source: facts 7554 1726853173.50395: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853173.50401: _execute() done 7554 1726853173.50404: dumping result to json 7554 1726853173.50407: done dumping result, returning 7554 1726853173.50413: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-bdc3-98b6-000000001030] 7554 1726853173.50419: sending task result for task 02083763-bbaf-bdc3-98b6-000000001030 7554 1726853173.50508: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001030 7554 1726853173.50511: WORKER PROCESS EXITING 7554 1726853173.50535: no more pending results, returning what we have 7554 1726853173.50540: in VariableManager get_vars() 7554 1726853173.50592: Calling all_inventory to load vars for managed_node3 7554 1726853173.50594: Calling groups_inventory to load vars for managed_node3 7554 1726853173.50597: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.50608: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.50611: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.50613: Calling groups_plugins_play to load vars for managed_node3 7554 1726853173.51945: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.53745: done with get_vars() 7554 1726853173.53767: variable 'ansible_search_path' from source: unknown 7554 1726853173.53768: variable 'ansible_search_path' from source: unknown 7554 1726853173.53835: we have included files to process 7554 1726853173.53837: generating all_blocks data 7554 1726853173.53839: done generating all_blocks data 7554 1726853173.53840: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853173.53844: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853173.53847: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853173.54134: done processing included file 7554 1726853173.54136: iterating over new_blocks loaded from include file 7554 1726853173.54138: in VariableManager get_vars() 7554 1726853173.54167: done with get_vars() 7554 1726853173.54169: filtering new block on tags 7554 1726853173.54188: done filtering new block on tags 7554 1726853173.54190: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7554 1726853173.54195: extending task lists for all hosts with included blocks 7554 1726853173.54328: done extending task lists 7554 1726853173.54330: done processing included files 7554 1726853173.54331: results queue empty 7554 1726853173.54331: checking for any_errors_fatal 7554 1726853173.54334: done checking for any_errors_fatal 7554 1726853173.54335: checking for max_fail_percentage 7554 1726853173.54336: done checking for max_fail_percentage 7554 
1726853173.54337: checking to see if all hosts have failed and the running result is not ok 7554 1726853173.54338: done checking to see if all hosts have failed 7554 1726853173.54339: getting the remaining hosts for this loop 7554 1726853173.54340: done getting the remaining hosts for this loop 7554 1726853173.54344: getting the next task for host managed_node3 7554 1726853173.54348: done getting next task for host managed_node3 7554 1726853173.54350: ^ task is: TASK: Gather current interface info 7554 1726853173.54354: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853173.54356: getting variables 7554 1726853173.54357: in VariableManager get_vars() 7554 1726853173.54374: Calling all_inventory to load vars for managed_node3 7554 1726853173.54376: Calling groups_inventory to load vars for managed_node3 7554 1726853173.54378: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.54384: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.54386: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.54388: Calling groups_plugins_play to load vars for managed_node3 7554 1726853173.55428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.57047: done with get_vars() 7554 1726853173.57076: done getting variables 7554 1726853173.57125: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:26:13 -0400 (0:00:00.080) 0:00:27.539 ****** 7554 1726853173.57164: entering _queue_task() for managed_node3/command 7554 1726853173.57561: worker is 1 (out of 1 available) 7554 1726853173.57778: exiting _queue_task() for managed_node3/command 7554 1726853173.57791: done queuing things up, now waiting for results queue to drain 7554 1726853173.57793: waiting for pending results... 
7554 1726853173.58004: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7554 1726853173.58077: in run() - task 02083763-bbaf-bdc3-98b6-000000001067 7554 1726853173.58083: variable 'ansible_search_path' from source: unknown 7554 1726853173.58086: variable 'ansible_search_path' from source: unknown 7554 1726853173.58178: calling self._execute() 7554 1726853173.58207: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.58213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.58225: variable 'omit' from source: magic vars 7554 1726853173.58627: variable 'ansible_distribution_major_version' from source: facts 7554 1726853173.58644: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853173.58648: variable 'omit' from source: magic vars 7554 1726853173.58703: variable 'omit' from source: magic vars 7554 1726853173.58739: variable 'omit' from source: magic vars 7554 1726853173.58789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853173.58823: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853173.58843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853173.58880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853173.58884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853173.58977: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853173.58981: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.58984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.59014: Set connection 
var ansible_shell_executable to /bin/sh 7554 1726853173.59023: Set connection var ansible_pipelining to False 7554 1726853173.59026: Set connection var ansible_shell_type to sh 7554 1726853173.59028: Set connection var ansible_connection to ssh 7554 1726853173.59037: Set connection var ansible_timeout to 10 7554 1726853173.59042: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853173.59078: variable 'ansible_shell_executable' from source: unknown 7554 1726853173.59081: variable 'ansible_connection' from source: unknown 7554 1726853173.59085: variable 'ansible_module_compression' from source: unknown 7554 1726853173.59087: variable 'ansible_shell_type' from source: unknown 7554 1726853173.59090: variable 'ansible_shell_executable' from source: unknown 7554 1726853173.59091: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.59181: variable 'ansible_pipelining' from source: unknown 7554 1726853173.59185: variable 'ansible_timeout' from source: unknown 7554 1726853173.59188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.59224: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853173.59234: variable 'omit' from source: magic vars 7554 1726853173.59239: starting attempt loop 7554 1726853173.59242: running the handler 7554 1726853173.59260: _low_level_execute_command(): starting 7554 1726853173.59267: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853173.59974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853173.59985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853173.59998: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853173.60082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853173.60085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853173.60087: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853173.60089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853173.60136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853173.60143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853173.60162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853173.60267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853173.61985: stdout chunk (state=3): >>>/root <<< 7554 1726853173.62151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853173.62155: stdout chunk (state=3): >>><<< 7554 1726853173.62157: stderr chunk (state=3): >>><<< 7554 1726853173.62197: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853173.62312: _low_level_execute_command(): starting 7554 1726853173.62316: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124 `" && echo ansible-tmp-1726853173.6220846-8593-203416795687124="` echo /root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124 `" ) && sleep 0' 7554 1726853173.62943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853173.62956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853173.62959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853173.63000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853173.63100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853173.65095: stdout chunk (state=3): >>>ansible-tmp-1726853173.6220846-8593-203416795687124=/root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124 <<< 7554 1726853173.65265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853173.65268: stdout chunk (state=3): >>><<< 7554 1726853173.65272: stderr chunk (state=3): >>><<< 7554 1726853173.65512: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853173.6220846-8593-203416795687124=/root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853173.65515: variable 'ansible_module_compression' from source: unknown 7554 1726853173.65517: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853173.65519: variable 'ansible_facts' from source: unknown 7554 1726853173.65562: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124/AnsiballZ_command.py 7554 1726853173.65691: Sending initial data 7554 1726853173.65754: Sent initial data (154 bytes) 7554 1726853173.66367: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853173.66408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853173.66421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853173.66473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853173.66535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853173.66555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853173.66582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853173.66665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853173.68337: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7554 1726853173.68355: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853173.68408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853173.68463: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpnzq5_7s6 /root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124/AnsiballZ_command.py <<< 7554 1726853173.68472: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124/AnsiballZ_command.py" <<< 7554 1726853173.68519: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpnzq5_7s6" to remote "/root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124/AnsiballZ_command.py" <<< 7554 1726853173.69378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853173.69381: stderr chunk (state=3): >>><<< 7554 1726853173.69384: stdout chunk (state=3): >>><<< 7554 1726853173.69386: done transferring module to remote 7554 1726853173.69388: _low_level_execute_command(): starting 7554 1726853173.69391: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124/ /root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124/AnsiballZ_command.py && sleep 0' 7554 1726853173.69839: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853173.69855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853173.69898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853173.69910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853173.69977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853173.71861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853173.71869: stdout chunk (state=3): >>><<< 7554 1726853173.71881: stderr chunk (state=3): >>><<< 7554 1726853173.71986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853173.71991: _low_level_execute_command(): starting 7554 1726853173.71995: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124/AnsiballZ_command.py && sleep 0' 7554 1726853173.72586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853173.72647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853173.72673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853173.72689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853173.72786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853173.88700: stdout 
chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:26:13.882010", "end": "2024-09-20 13:26:13.885373", "delta": "0:00:00.003363", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853173.90364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853173.90397: stderr chunk (state=3): >>><<< 7554 1726853173.90400: stdout chunk (state=3): >>><<< 7554 1726853173.90416: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:26:13.882010", "end": "2024-09-20 13:26:13.885373", "delta": "0:00:00.003363", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853173.90444: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853173.90453: _low_level_execute_command(): starting 7554 1726853173.90458: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853173.6220846-8593-203416795687124/ > /dev/null 2>&1 && sleep 0' 7554 1726853173.90908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853173.90911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found <<< 7554 1726853173.90914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853173.90918: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853173.90920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853173.90974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853173.90984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853173.90986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853173.91045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853173.92917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853173.92944: stderr chunk (state=3): >>><<< 7554 1726853173.92948: stdout chunk (state=3): >>><<< 7554 1726853173.92958: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853173.92964: handler run complete 7554 1726853173.92989: Evaluated conditional (False): False 7554 1726853173.92997: attempt loop complete, returning result 7554 1726853173.93000: _execute() done 7554 1726853173.93002: dumping result to json 7554 1726853173.93007: done dumping result, returning 7554 1726853173.93014: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-bdc3-98b6-000000001067] 7554 1726853173.93020: sending task result for task 02083763-bbaf-bdc3-98b6-000000001067 7554 1726853173.93115: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001067 7554 1726853173.93117: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003363", "end": "2024-09-20 13:26:13.885373", "rc": 0, "start": "2024-09-20 13:26:13.882010" } STDOUT: eth0 lo peerveth0 veth0 7554 1726853173.93193: no more pending results, returning what we have 7554 1726853173.93196: results queue empty 7554 1726853173.93197: checking for any_errors_fatal 7554 1726853173.93198: done checking for any_errors_fatal 7554 1726853173.93199: checking for max_fail_percentage 7554 1726853173.93201: done checking for max_fail_percentage 7554 1726853173.93202: checking to see if all hosts have failed and the running result is 
not ok 7554 1726853173.93203: done checking to see if all hosts have failed 7554 1726853173.93203: getting the remaining hosts for this loop 7554 1726853173.93205: done getting the remaining hosts for this loop 7554 1726853173.93208: getting the next task for host managed_node3 7554 1726853173.93215: done getting next task for host managed_node3 7554 1726853173.93218: ^ task is: TASK: Set current_interfaces 7554 1726853173.93223: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853173.93229: getting variables 7554 1726853173.93231: in VariableManager get_vars() 7554 1726853173.93279: Calling all_inventory to load vars for managed_node3 7554 1726853173.93282: Calling groups_inventory to load vars for managed_node3 7554 1726853173.93284: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.93294: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.93296: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.93299: Calling groups_plugins_play to load vars for managed_node3 7554 1726853173.94182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.95040: done with get_vars() 7554 1726853173.95057: done getting variables 7554 1726853173.95103: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:26:13 -0400 (0:00:00.379) 0:00:27.918 ****** 7554 1726853173.95128: entering _queue_task() for managed_node3/set_fact 7554 1726853173.95353: worker is 1 (out of 1 available) 7554 1726853173.95367: exiting _queue_task() for managed_node3/set_fact 7554 1726853173.95381: done queuing things up, now waiting for results queue to drain 7554 1726853173.95383: waiting for pending results... 
7554 1726853173.95554: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7554 1726853173.95637: in run() - task 02083763-bbaf-bdc3-98b6-000000001068 7554 1726853173.95649: variable 'ansible_search_path' from source: unknown 7554 1726853173.95652: variable 'ansible_search_path' from source: unknown 7554 1726853173.95682: calling self._execute() 7554 1726853173.95755: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.95760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.95769: variable 'omit' from source: magic vars 7554 1726853173.96046: variable 'ansible_distribution_major_version' from source: facts 7554 1726853173.96056: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853173.96059: variable 'omit' from source: magic vars 7554 1726853173.96092: variable 'omit' from source: magic vars 7554 1726853173.96165: variable '_current_interfaces' from source: set_fact 7554 1726853173.96213: variable 'omit' from source: magic vars 7554 1726853173.96246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853173.96274: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853173.96291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853173.96304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853173.96313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853173.96336: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853173.96339: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.96344: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7554 1726853173.96414: Set connection var ansible_shell_executable to /bin/sh 7554 1726853173.96421: Set connection var ansible_pipelining to False 7554 1726853173.96423: Set connection var ansible_shell_type to sh 7554 1726853173.96426: Set connection var ansible_connection to ssh 7554 1726853173.96433: Set connection var ansible_timeout to 10 7554 1726853173.96438: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853173.96456: variable 'ansible_shell_executable' from source: unknown 7554 1726853173.96459: variable 'ansible_connection' from source: unknown 7554 1726853173.96462: variable 'ansible_module_compression' from source: unknown 7554 1726853173.96464: variable 'ansible_shell_type' from source: unknown 7554 1726853173.96466: variable 'ansible_shell_executable' from source: unknown 7554 1726853173.96469: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.96473: variable 'ansible_pipelining' from source: unknown 7554 1726853173.96476: variable 'ansible_timeout' from source: unknown 7554 1726853173.96482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.96579: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853173.96590: variable 'omit' from source: magic vars 7554 1726853173.96594: starting attempt loop 7554 1726853173.96597: running the handler 7554 1726853173.96608: handler run complete 7554 1726853173.96616: attempt loop complete, returning result 7554 1726853173.96619: _execute() done 7554 1726853173.96622: dumping result to json 7554 1726853173.96624: done dumping result, returning 7554 1726853173.96631: done running TaskExecutor() for managed_node3/TASK: 
Set current_interfaces [02083763-bbaf-bdc3-98b6-000000001068] 7554 1726853173.96636: sending task result for task 02083763-bbaf-bdc3-98b6-000000001068 ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo", "peerveth0", "veth0" ] }, "changed": false } 7554 1726853173.96769: no more pending results, returning what we have 7554 1726853173.96774: results queue empty 7554 1726853173.96775: checking for any_errors_fatal 7554 1726853173.96784: done checking for any_errors_fatal 7554 1726853173.96784: checking for max_fail_percentage 7554 1726853173.96786: done checking for max_fail_percentage 7554 1726853173.96787: checking to see if all hosts have failed and the running result is not ok 7554 1726853173.96788: done checking to see if all hosts have failed 7554 1726853173.96789: getting the remaining hosts for this loop 7554 1726853173.96790: done getting the remaining hosts for this loop 7554 1726853173.96793: getting the next task for host managed_node3 7554 1726853173.96800: done getting next task for host managed_node3 7554 1726853173.96802: ^ task is: TASK: Show current_interfaces 7554 1726853173.96806: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853173.96810: getting variables 7554 1726853173.96811: in VariableManager get_vars() 7554 1726853173.96855: Calling all_inventory to load vars for managed_node3 7554 1726853173.96857: Calling groups_inventory to load vars for managed_node3 7554 1726853173.96859: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853173.96868: Calling all_plugins_play to load vars for managed_node3 7554 1726853173.96877: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853173.96883: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001068 7554 1726853173.96885: WORKER PROCESS EXITING 7554 1726853173.96888: Calling groups_plugins_play to load vars for managed_node3 7554 1726853173.97628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853173.98495: done with get_vars() 7554 1726853173.98514: done getting variables 7554 1726853173.98558: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:26:13 -0400 (0:00:00.034) 0:00:27.953 ****** 7554 1726853173.98584: entering _queue_task() for managed_node3/debug 7554 1726853173.98832: worker is 1 (out of 1 available) 7554 1726853173.98851: exiting _queue_task() for managed_node3/debug 7554 1726853173.98863: done queuing things up, now waiting for results queue to drain 7554 1726853173.98865: waiting for pending results... 
7554 1726853173.99046: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7554 1726853173.99122: in run() - task 02083763-bbaf-bdc3-98b6-000000001031 7554 1726853173.99133: variable 'ansible_search_path' from source: unknown 7554 1726853173.99137: variable 'ansible_search_path' from source: unknown 7554 1726853173.99165: calling self._execute() 7554 1726853173.99243: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.99248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853173.99255: variable 'omit' from source: magic vars 7554 1726853173.99531: variable 'ansible_distribution_major_version' from source: facts 7554 1726853173.99544: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853173.99547: variable 'omit' from source: magic vars 7554 1726853173.99576: variable 'omit' from source: magic vars 7554 1726853173.99645: variable 'current_interfaces' from source: set_fact 7554 1726853173.99665: variable 'omit' from source: magic vars 7554 1726853173.99699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853173.99725: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853173.99747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853173.99759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853173.99769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853173.99794: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853173.99797: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.99800: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7554 1726853173.99873: Set connection var ansible_shell_executable to /bin/sh 7554 1726853173.99881: Set connection var ansible_pipelining to False 7554 1726853173.99884: Set connection var ansible_shell_type to sh 7554 1726853173.99887: Set connection var ansible_connection to ssh 7554 1726853173.99895: Set connection var ansible_timeout to 10 7554 1726853173.99899: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853173.99917: variable 'ansible_shell_executable' from source: unknown 7554 1726853173.99920: variable 'ansible_connection' from source: unknown 7554 1726853173.99923: variable 'ansible_module_compression' from source: unknown 7554 1726853173.99925: variable 'ansible_shell_type' from source: unknown 7554 1726853173.99927: variable 'ansible_shell_executable' from source: unknown 7554 1726853173.99930: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853173.99934: variable 'ansible_pipelining' from source: unknown 7554 1726853173.99936: variable 'ansible_timeout' from source: unknown 7554 1726853173.99943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853174.00043: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853174.00051: variable 'omit' from source: magic vars 7554 1726853174.00056: starting attempt loop 7554 1726853174.00059: running the handler 7554 1726853174.00100: handler run complete 7554 1726853174.00110: attempt loop complete, returning result 7554 1726853174.00113: _execute() done 7554 1726853174.00116: dumping result to json 7554 1726853174.00118: done dumping result, returning 7554 1726853174.00124: done running TaskExecutor() for managed_node3/TASK: Show 
current_interfaces [02083763-bbaf-bdc3-98b6-000000001031] 7554 1726853174.00129: sending task result for task 02083763-bbaf-bdc3-98b6-000000001031 7554 1726853174.00208: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001031 7554 1726853174.00210: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo', 'peerveth0', 'veth0'] 7554 1726853174.00262: no more pending results, returning what we have 7554 1726853174.00265: results queue empty 7554 1726853174.00266: checking for any_errors_fatal 7554 1726853174.00272: done checking for any_errors_fatal 7554 1726853174.00273: checking for max_fail_percentage 7554 1726853174.00274: done checking for max_fail_percentage 7554 1726853174.00275: checking to see if all hosts have failed and the running result is not ok 7554 1726853174.00276: done checking to see if all hosts have failed 7554 1726853174.00277: getting the remaining hosts for this loop 7554 1726853174.00278: done getting the remaining hosts for this loop 7554 1726853174.00282: getting the next task for host managed_node3 7554 1726853174.00289: done getting next task for host managed_node3 7554 1726853174.00292: ^ task is: TASK: Install iproute 7554 1726853174.00295: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853174.00301: getting variables 7554 1726853174.00302: in VariableManager get_vars() 7554 1726853174.00352: Calling all_inventory to load vars for managed_node3 7554 1726853174.00354: Calling groups_inventory to load vars for managed_node3 7554 1726853174.00356: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853174.00366: Calling all_plugins_play to load vars for managed_node3 7554 1726853174.00368: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853174.00377: Calling groups_plugins_play to load vars for managed_node3 7554 1726853174.01281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853174.02125: done with get_vars() 7554 1726853174.02144: done getting variables 7554 1726853174.02188: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:26:14 -0400 (0:00:00.036) 0:00:27.989 ****** 7554 1726853174.02210: entering _queue_task() for managed_node3/package 7554 1726853174.02464: worker is 1 (out of 1 available) 7554 1726853174.02477: exiting _queue_task() for managed_node3/package 7554 1726853174.02488: done queuing things up, now waiting for results queue to drain 7554 1726853174.02490: waiting for pending results... 
7554 1726853174.02676: running TaskExecutor() for managed_node3/TASK: Install iproute 7554 1726853174.02754: in run() - task 02083763-bbaf-bdc3-98b6-000000000e02 7554 1726853174.02767: variable 'ansible_search_path' from source: unknown 7554 1726853174.02772: variable 'ansible_search_path' from source: unknown 7554 1726853174.02808: calling self._execute() 7554 1726853174.02879: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853174.02883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853174.02892: variable 'omit' from source: magic vars 7554 1726853174.03168: variable 'ansible_distribution_major_version' from source: facts 7554 1726853174.03181: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853174.03187: variable 'omit' from source: magic vars 7554 1726853174.03213: variable 'omit' from source: magic vars 7554 1726853174.03349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853174.04795: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853174.04837: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853174.04866: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853174.04896: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853174.04915: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853174.04990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853174.05019: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853174.05036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853174.05063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853174.05074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853174.05149: variable '__network_is_ostree' from source: set_fact 7554 1726853174.05153: variable 'omit' from source: magic vars 7554 1726853174.05179: variable 'omit' from source: magic vars 7554 1726853174.05208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853174.05224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853174.05239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853174.05253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853174.05262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853174.05287: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853174.05290: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853174.05293: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 7554 1726853174.05363: Set connection var ansible_shell_executable to /bin/sh 7554 1726853174.05373: Set connection var ansible_pipelining to False 7554 1726853174.05376: Set connection var ansible_shell_type to sh 7554 1726853174.05378: Set connection var ansible_connection to ssh 7554 1726853174.05386: Set connection var ansible_timeout to 10 7554 1726853174.05391: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853174.05408: variable 'ansible_shell_executable' from source: unknown 7554 1726853174.05411: variable 'ansible_connection' from source: unknown 7554 1726853174.05414: variable 'ansible_module_compression' from source: unknown 7554 1726853174.05425: variable 'ansible_shell_type' from source: unknown 7554 1726853174.05429: variable 'ansible_shell_executable' from source: unknown 7554 1726853174.05431: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853174.05433: variable 'ansible_pipelining' from source: unknown 7554 1726853174.05435: variable 'ansible_timeout' from source: unknown 7554 1726853174.05437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853174.05502: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853174.05511: variable 'omit' from source: magic vars 7554 1726853174.05516: starting attempt loop 7554 1726853174.05519: running the handler 7554 1726853174.05530: variable 'ansible_facts' from source: unknown 7554 1726853174.05533: variable 'ansible_facts' from source: unknown 7554 1726853174.05559: _low_level_execute_command(): starting 7554 1726853174.05566: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853174.06049: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853174.06076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853174.06080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853174.06082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.06085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853174.06088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853174.06101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.06157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853174.06160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853174.06162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.06236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853174.07948: stdout chunk (state=3): >>>/root <<< 7554 1726853174.08048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853174.08081: stderr chunk (state=3): >>><<< 7554 1726853174.08084: stdout chunk (state=3): >>><<< 7554 1726853174.08104: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853174.08114: _low_level_execute_command(): starting 7554 1726853174.08122: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378 `" && echo ansible-tmp-1726853174.081039-8609-137690609476378="` echo /root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378 `" ) && sleep 0' 7554 1726853174.08628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853174.08632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853174.08635: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.08637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853174.08639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853174.08641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.08691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853174.08695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853174.08710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.08770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853174.10789: stdout chunk (state=3): >>>ansible-tmp-1726853174.081039-8609-137690609476378=/root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378 <<< 7554 1726853174.10949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853174.10955: stdout chunk (state=3): >>><<< 7554 1726853174.10957: stderr chunk (state=3): >>><<< 7554 1726853174.11001: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853174.081039-8609-137690609476378=/root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853174.11032: variable 'ansible_module_compression' from source: unknown 7554 1726853174.11078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7554 1726853174.11114: variable 'ansible_facts' from source: unknown 7554 1726853174.11202: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378/AnsiballZ_dnf.py 7554 1726853174.11306: Sending initial data 7554 1726853174.11310: Sent initial data (149 bytes) 7554 1726853174.11758: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853174.11763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853174.11765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853174.11768: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853174.11773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.11815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853174.11822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853174.11824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.11889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853174.13545: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853174.13632: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853174.13696: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpi8bg5b_u /root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378/AnsiballZ_dnf.py <<< 7554 1726853174.13700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378/AnsiballZ_dnf.py" <<< 7554 1726853174.13754: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpi8bg5b_u" to remote "/root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378/AnsiballZ_dnf.py" <<< 7554 1726853174.14912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853174.14915: stdout chunk (state=3): >>><<< 7554 1726853174.14917: stderr chunk (state=3): >>><<< 7554 1726853174.14926: done transferring module to remote 7554 1726853174.14944: _low_level_execute_command(): starting 7554 1726853174.14964: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378/ /root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378/AnsiballZ_dnf.py && sleep 0' 7554 1726853174.15596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853174.15618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853174.15633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853174.15653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853174.15670: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853174.15725: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.15781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853174.15811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853174.15831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.15920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853174.17847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853174.17869: stdout chunk (state=3): >>><<< 7554 1726853174.17875: stderr chunk (state=3): >>><<< 7554 1726853174.17891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853174.17980: _low_level_execute_command(): starting 7554 1726853174.17984: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378/AnsiballZ_dnf.py && sleep 0' 7554 1726853174.18592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.18638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' <<< 7554 1726853174.18654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853174.18675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.18779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853174.61551: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7554 1726853174.70182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853174.70204: stderr chunk (state=3): >>><<< 7554 1726853174.70207: stdout chunk (state=3): >>><<< 7554 1726853174.70227: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853174.70268: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853174.70273: _low_level_execute_command(): starting 7554 1726853174.70279: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853174.081039-8609-137690609476378/ > /dev/null 2>&1 && sleep 0' 7554 1726853174.70734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853174.70738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.70740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 
1726853174.70742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853174.70744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.70798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853174.70801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853174.70803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.70865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853174.72749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853174.72770: stderr chunk (state=3): >>><<< 7554 1726853174.72775: stdout chunk (state=3): >>><<< 7554 1726853174.72787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853174.72796: handler run complete 7554 1726853174.72912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853174.73042: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853174.73074: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853174.73099: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853174.73135: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853174.73192: variable '__install_status' from source: set_fact 7554 1726853174.73206: Evaluated conditional (__install_status is success): True 7554 1726853174.73218: attempt loop complete, returning result 7554 1726853174.73221: _execute() done 7554 1726853174.73224: dumping result to json 7554 1726853174.73227: done dumping result, returning 7554 1726853174.73235: done running TaskExecutor() for managed_node3/TASK: Install iproute [02083763-bbaf-bdc3-98b6-000000000e02] 7554 1726853174.73241: sending task result for task 02083763-bbaf-bdc3-98b6-000000000e02 7554 1726853174.73332: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000e02 7554 1726853174.73335: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7554 1726853174.73416: no more pending results, returning what we have 7554 1726853174.73420: results queue empty 7554 1726853174.73421: checking for any_errors_fatal 7554 1726853174.73426: done checking for 
any_errors_fatal 7554 1726853174.73427: checking for max_fail_percentage 7554 1726853174.73428: done checking for max_fail_percentage 7554 1726853174.73429: checking to see if all hosts have failed and the running result is not ok 7554 1726853174.73430: done checking to see if all hosts have failed 7554 1726853174.73431: getting the remaining hosts for this loop 7554 1726853174.73432: done getting the remaining hosts for this loop 7554 1726853174.73435: getting the next task for host managed_node3 7554 1726853174.73441: done getting next task for host managed_node3 7554 1726853174.73444: ^ task is: TASK: Create veth interface {{ interface }} 7554 1726853174.73453: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853174.73457: getting variables 7554 1726853174.73459: in VariableManager get_vars() 7554 1726853174.73506: Calling all_inventory to load vars for managed_node3 7554 1726853174.73509: Calling groups_inventory to load vars for managed_node3 7554 1726853174.73511: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853174.73521: Calling all_plugins_play to load vars for managed_node3 7554 1726853174.73523: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853174.73525: Calling groups_plugins_play to load vars for managed_node3 7554 1726853174.74309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853174.75155: done with get_vars() 7554 1726853174.75172: done getting variables 7554 1726853174.75215: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853174.75304: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:26:14 -0400 (0:00:00.731) 0:00:28.720 ****** 7554 1726853174.75327: entering _queue_task() for managed_node3/command 7554 1726853174.75546: worker is 1 (out of 1 available) 7554 1726853174.75560: exiting _queue_task() for managed_node3/command 7554 1726853174.75572: done queuing things up, now waiting for results queue to drain 7554 1726853174.75575: waiting for pending results... 
7554 1726853174.75746: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7554 1726853174.75824: in run() - task 02083763-bbaf-bdc3-98b6-000000000e03 7554 1726853174.75836: variable 'ansible_search_path' from source: unknown 7554 1726853174.75840: variable 'ansible_search_path' from source: unknown 7554 1726853174.76045: variable 'interface' from source: play vars 7554 1726853174.76107: variable 'interface' from source: play vars 7554 1726853174.76161: variable 'interface' from source: play vars 7554 1726853174.76276: Loaded config def from plugin (lookup/items) 7554 1726853174.76287: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7554 1726853174.76306: variable 'omit' from source: magic vars 7554 1726853174.76403: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853174.76411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853174.76420: variable 'omit' from source: magic vars 7554 1726853174.76588: variable 'ansible_distribution_major_version' from source: facts 7554 1726853174.76595: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853174.76723: variable 'type' from source: play vars 7554 1726853174.76727: variable 'state' from source: include params 7554 1726853174.76730: variable 'interface' from source: play vars 7554 1726853174.76735: variable 'current_interfaces' from source: set_fact 7554 1726853174.76742: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7554 1726853174.76748: when evaluation is False, skipping this task 7554 1726853174.76767: variable 'item' from source: unknown 7554 1726853174.76821: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 
'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 7554 1726853174.76964: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853174.76968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853174.76970: variable 'omit' from source: magic vars 7554 1726853174.77030: variable 'ansible_distribution_major_version' from source: facts 7554 1726853174.77033: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853174.77150: variable 'type' from source: play vars 7554 1726853174.77153: variable 'state' from source: include params 7554 1726853174.77156: variable 'interface' from source: play vars 7554 1726853174.77160: variable 'current_interfaces' from source: set_fact 7554 1726853174.77166: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7554 1726853174.77168: when evaluation is False, skipping this task 7554 1726853174.77191: variable 'item' from source: unknown 7554 1726853174.77232: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 7554 1726853174.77298: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853174.77304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853174.77313: variable 'omit' from source: magic vars 7554 1726853174.77405: variable 'ansible_distribution_major_version' from source: facts 7554 1726853174.77408: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853174.77523: variable 'type' from source: play vars 7554 
1726853174.77526: variable 'state' from source: include params 7554 1726853174.77529: variable 'interface' from source: play vars 7554 1726853174.77532: variable 'current_interfaces' from source: set_fact 7554 1726853174.77541: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7554 1726853174.77543: when evaluation is False, skipping this task 7554 1726853174.77561: variable 'item' from source: unknown 7554 1726853174.77605: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 7554 1726853174.77681: dumping result to json 7554 1726853174.77684: done dumping result, returning 7554 1726853174.77685: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [02083763-bbaf-bdc3-98b6-000000000e03] 7554 1726853174.77687: sending task result for task 02083763-bbaf-bdc3-98b6-000000000e03 7554 1726853174.77720: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000e03 7554 1726853174.77723: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false } MSG: All items skipped 7554 1726853174.77817: no more pending results, returning what we have 7554 1726853174.77821: results queue empty 7554 1726853174.77822: checking for any_errors_fatal 7554 1726853174.77828: done checking for any_errors_fatal 7554 1726853174.77829: checking for max_fail_percentage 7554 1726853174.77830: done checking for max_fail_percentage 7554 1726853174.77830: checking to see if all hosts have failed and the running result is not ok 7554 1726853174.77831: done checking to see if all hosts have failed 7554 1726853174.77832: getting the remaining hosts for this loop 7554 1726853174.77833: done getting the remaining hosts for this loop 
7554 1726853174.77836: getting the next task for host managed_node3 7554 1726853174.77841: done getting next task for host managed_node3 7554 1726853174.77843: ^ task is: TASK: Set up veth as managed by NetworkManager 7554 1726853174.77846: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853174.77849: getting variables 7554 1726853174.77851: in VariableManager get_vars() 7554 1726853174.77893: Calling all_inventory to load vars for managed_node3 7554 1726853174.77895: Calling groups_inventory to load vars for managed_node3 7554 1726853174.77897: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853174.77906: Calling all_plugins_play to load vars for managed_node3 7554 1726853174.77908: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853174.77910: Calling groups_plugins_play to load vars for managed_node3 7554 1726853174.78729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853174.79577: done with get_vars() 7554 1726853174.79592: done getting variables 7554 1726853174.79632: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] 
******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:26:14 -0400 (0:00:00.043) 0:00:28.764 ****** 7554 1726853174.79659: entering _queue_task() for managed_node3/command 7554 1726853174.79879: worker is 1 (out of 1 available) 7554 1726853174.79893: exiting _queue_task() for managed_node3/command 7554 1726853174.79905: done queuing things up, now waiting for results queue to drain 7554 1726853174.79906: waiting for pending results... 7554 1726853174.80087: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7554 1726853174.80159: in run() - task 02083763-bbaf-bdc3-98b6-000000000e04 7554 1726853174.80172: variable 'ansible_search_path' from source: unknown 7554 1726853174.80177: variable 'ansible_search_path' from source: unknown 7554 1726853174.80203: calling self._execute() 7554 1726853174.80283: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853174.80288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853174.80296: variable 'omit' from source: magic vars 7554 1726853174.80564: variable 'ansible_distribution_major_version' from source: facts 7554 1726853174.80578: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853174.80686: variable 'type' from source: play vars 7554 1726853174.80689: variable 'state' from source: include params 7554 1726853174.80693: Evaluated conditional (type == 'veth' and state == 'present'): False 7554 1726853174.80695: when evaluation is False, skipping this task 7554 1726853174.80699: _execute() done 7554 1726853174.80702: dumping result to json 7554 1726853174.80706: done dumping result, returning 7554 1726853174.80712: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-bdc3-98b6-000000000e04] 7554 
1726853174.80718: sending task result for task 02083763-bbaf-bdc3-98b6-000000000e04 7554 1726853174.80797: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000e04 7554 1726853174.80800: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 7554 1726853174.80843: no more pending results, returning what we have 7554 1726853174.80847: results queue empty 7554 1726853174.80847: checking for any_errors_fatal 7554 1726853174.80859: done checking for any_errors_fatal 7554 1726853174.80860: checking for max_fail_percentage 7554 1726853174.80862: done checking for max_fail_percentage 7554 1726853174.80862: checking to see if all hosts have failed and the running result is not ok 7554 1726853174.80863: done checking to see if all hosts have failed 7554 1726853174.80864: getting the remaining hosts for this loop 7554 1726853174.80866: done getting the remaining hosts for this loop 7554 1726853174.80870: getting the next task for host managed_node3 7554 1726853174.80877: done getting next task for host managed_node3 7554 1726853174.80880: ^ task is: TASK: Delete veth interface {{ interface }} 7554 1726853174.80883: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853174.80887: getting variables 7554 1726853174.80889: in VariableManager get_vars() 7554 1726853174.80929: Calling all_inventory to load vars for managed_node3 7554 1726853174.80931: Calling groups_inventory to load vars for managed_node3 7554 1726853174.80933: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853174.80943: Calling all_plugins_play to load vars for managed_node3 7554 1726853174.80946: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853174.80948: Calling groups_plugins_play to load vars for managed_node3 7554 1726853174.81693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853174.82625: done with get_vars() 7554 1726853174.82639: done getting variables 7554 1726853174.82682: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853174.82761: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 13:26:14 -0400 (0:00:00.031) 0:00:28.795 ****** 7554 1726853174.82785: entering _queue_task() for managed_node3/command 7554 1726853174.82997: worker is 1 (out of 1 available) 7554 1726853174.83012: exiting _queue_task() for managed_node3/command 7554 1726853174.83024: done queuing things up, now waiting for results queue to drain 7554 1726853174.83026: waiting for pending results... 
7554 1726853174.83199: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7554 1726853174.83266: in run() - task 02083763-bbaf-bdc3-98b6-000000000e05 7554 1726853174.83280: variable 'ansible_search_path' from source: unknown 7554 1726853174.83284: variable 'ansible_search_path' from source: unknown 7554 1726853174.83309: calling self._execute() 7554 1726853174.83386: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853174.83392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853174.83400: variable 'omit' from source: magic vars 7554 1726853174.83659: variable 'ansible_distribution_major_version' from source: facts 7554 1726853174.83669: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853174.83802: variable 'type' from source: play vars 7554 1726853174.83806: variable 'state' from source: include params 7554 1726853174.83809: variable 'interface' from source: play vars 7554 1726853174.83812: variable 'current_interfaces' from source: set_fact 7554 1726853174.83822: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 7554 1726853174.83827: variable 'omit' from source: magic vars 7554 1726853174.83855: variable 'omit' from source: magic vars 7554 1726853174.83923: variable 'interface' from source: play vars 7554 1726853174.83936: variable 'omit' from source: magic vars 7554 1726853174.83970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853174.83998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853174.84016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853174.84030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 
1726853174.84039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853174.84064: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853174.84068: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853174.84072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853174.84141: Set connection var ansible_shell_executable to /bin/sh 7554 1726853174.84150: Set connection var ansible_pipelining to False 7554 1726853174.84153: Set connection var ansible_shell_type to sh 7554 1726853174.84155: Set connection var ansible_connection to ssh 7554 1726853174.84162: Set connection var ansible_timeout to 10 7554 1726853174.84167: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853174.84185: variable 'ansible_shell_executable' from source: unknown 7554 1726853174.84188: variable 'ansible_connection' from source: unknown 7554 1726853174.84190: variable 'ansible_module_compression' from source: unknown 7554 1726853174.84192: variable 'ansible_shell_type' from source: unknown 7554 1726853174.84195: variable 'ansible_shell_executable' from source: unknown 7554 1726853174.84197: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853174.84201: variable 'ansible_pipelining' from source: unknown 7554 1726853174.84204: variable 'ansible_timeout' from source: unknown 7554 1726853174.84208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853174.84308: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853174.84316: variable 'omit' from source: magic vars 7554 
1726853174.84320: starting attempt loop 7554 1726853174.84324: running the handler 7554 1726853174.84337: _low_level_execute_command(): starting 7554 1726853174.84352: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853174.84857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853174.84861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.84865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853174.84867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.84921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853174.84924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853174.84926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.84998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853174.86716: stdout chunk (state=3): >>>/root <<< 7554 1726853174.86810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853174.86840: stderr chunk 
(state=3): >>><<< 7554 1726853174.86845: stdout chunk (state=3): >>><<< 7554 1726853174.86862: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853174.86875: _low_level_execute_command(): starting 7554 1726853174.86881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247 `" && echo ansible-tmp-1726853174.868621-8637-148173892647247="` echo /root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247 `" ) && sleep 0' 7554 1726853174.87323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853174.87333: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853174.87335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853174.87338: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853174.87340: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.87381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853174.87393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.87456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853174.89421: stdout chunk (state=3): >>>ansible-tmp-1726853174.868621-8637-148173892647247=/root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247 <<< 7554 1726853174.89527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853174.89553: stderr chunk (state=3): >>><<< 7554 1726853174.89556: stdout chunk (state=3): >>><<< 7554 1726853174.89568: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853174.868621-8637-148173892647247=/root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853174.89596: variable 'ansible_module_compression' from source: unknown 7554 1726853174.89633: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853174.89672: variable 'ansible_facts' from source: unknown 7554 1726853174.89726: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247/AnsiballZ_command.py 7554 1726853174.89900: Sending initial data 7554 1726853174.89903: Sent initial data (153 bytes) 7554 1726853174.90317: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853174.90320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853174.90323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.90325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853174.90327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.90380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853174.90383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.90451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853174.92061: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853174.92134: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853174.92208: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpx7ugoqgp /root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247/AnsiballZ_command.py <<< 7554 1726853174.92211: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247/AnsiballZ_command.py" <<< 7554 1726853174.92259: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpx7ugoqgp" to remote "/root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247/AnsiballZ_command.py" <<< 7554 1726853174.92987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853174.93027: stderr chunk (state=3): >>><<< 7554 1726853174.93030: stdout chunk (state=3): >>><<< 7554 1726853174.93062: done transferring module to remote 7554 1726853174.93072: _low_level_execute_command(): starting 7554 1726853174.93077: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247/ /root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247/AnsiballZ_command.py && sleep 0' 7554 1726853174.93498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853174.93502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853174.93505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.93521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.93567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853174.93572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.93638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853174.95556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853174.95560: stdout chunk (state=3): >>><<< 7554 1726853174.95562: stderr chunk (state=3): >>><<< 7554 1726853174.95580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853174.95588: _low_level_execute_command(): starting 7554 1726853174.95596: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247/AnsiballZ_command.py && sleep 0' 7554 1726853174.96224: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853174.96285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853174.96349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853174.96364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 7554 1726853174.96386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853174.96483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853175.13505: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 13:26:15.121015", "end": "2024-09-20 13:26:15.131410", "delta": "0:00:00.010395", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853175.15982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853175.15986: stdout chunk (state=3): >>><<< 7554 1726853175.15989: stderr chunk (state=3): >>><<< 7554 1726853175.15991: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 13:26:15.121015", "end": "2024-09-20 13:26:15.131410", "delta": "0:00:00.010395", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853175.15994: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853175.15997: _low_level_execute_command(): starting 7554 1726853175.15999: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853174.868621-8637-148173892647247/ > /dev/null 2>&1 && sleep 0' 7554 1726853175.16757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853175.16797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853175.16900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853175.18834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853175.18864: stdout chunk (state=3): >>><<< 7554 1726853175.18868: stderr chunk (state=3): >>><<< 7554 1726853175.18949: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853175.18952: handler run complete 7554 1726853175.18955: Evaluated conditional (False): False 7554 1726853175.18957: attempt loop complete, returning result 7554 1726853175.18959: _execute() done 7554 1726853175.18961: dumping result to json 7554 1726853175.18963: done dumping result, returning 7554 1726853175.18965: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [02083763-bbaf-bdc3-98b6-000000000e05] 7554 1726853175.18983: sending task result for task 02083763-bbaf-bdc3-98b6-000000000e05 7554 1726853175.19301: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000e05 7554 1726853175.19304: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.010395", "end": "2024-09-20 13:26:15.131410", "rc": 0, "start": "2024-09-20 13:26:15.121015" } 7554 1726853175.19368: no more pending results, returning what we have 7554 1726853175.19373: results queue empty 7554 1726853175.19374: checking for any_errors_fatal 7554 1726853175.19381: done checking for any_errors_fatal 7554 1726853175.19382: checking for max_fail_percentage 7554 1726853175.19384: done checking for max_fail_percentage 7554 1726853175.19384: checking to see if all hosts have failed and the running result is not ok 7554 
1726853175.19386: done checking to see if all hosts have failed 7554 1726853175.19386: getting the remaining hosts for this loop 7554 1726853175.19388: done getting the remaining hosts for this loop 7554 1726853175.19391: getting the next task for host managed_node3 7554 1726853175.19397: done getting next task for host managed_node3 7554 1726853175.19400: ^ task is: TASK: Create dummy interface {{ interface }} 7554 1726853175.19403: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.19407: getting variables 7554 1726853175.19409: in VariableManager get_vars() 7554 1726853175.19495: Calling all_inventory to load vars for managed_node3 7554 1726853175.19498: Calling groups_inventory to load vars for managed_node3 7554 1726853175.19500: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.19512: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.19515: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.19517: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.20799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.22379: done with get_vars() 7554 1726853175.22404: done getting variables 7554 1726853175.22465: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853175.22580: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:26:15 -0400 (0:00:00.398) 0:00:29.193 ****** 7554 1726853175.22611: entering _queue_task() for managed_node3/command 7554 1726853175.22960: worker is 1 (out of 1 available) 7554 1726853175.22976: exiting _queue_task() for managed_node3/command 7554 1726853175.22990: done queuing things up, now waiting for results queue to drain 7554 1726853175.22992: waiting for pending results... 
7554 1726853175.23350: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7554 1726853175.23379: in run() - task 02083763-bbaf-bdc3-98b6-000000000e06 7554 1726853175.23500: variable 'ansible_search_path' from source: unknown 7554 1726853175.23504: variable 'ansible_search_path' from source: unknown 7554 1726853175.23508: calling self._execute() 7554 1726853175.23576: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.23590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.23609: variable 'omit' from source: magic vars 7554 1726853175.24007: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.24026: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.24251: variable 'type' from source: play vars 7554 1726853175.24268: variable 'state' from source: include params 7554 1726853175.24280: variable 'interface' from source: play vars 7554 1726853175.24289: variable 'current_interfaces' from source: set_fact 7554 1726853175.24303: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7554 1726853175.24311: when evaluation is False, skipping this task 7554 1726853175.24318: _execute() done 7554 1726853175.24326: dumping result to json 7554 1726853175.24334: done dumping result, returning 7554 1726853175.24347: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [02083763-bbaf-bdc3-98b6-000000000e06] 7554 1726853175.24358: sending task result for task 02083763-bbaf-bdc3-98b6-000000000e06 7554 1726853175.24536: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000e06 7554 1726853175.24539: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7554 
1726853175.24632: no more pending results, returning what we have 7554 1726853175.24636: results queue empty 7554 1726853175.24637: checking for any_errors_fatal 7554 1726853175.24651: done checking for any_errors_fatal 7554 1726853175.24652: checking for max_fail_percentage 7554 1726853175.24653: done checking for max_fail_percentage 7554 1726853175.24654: checking to see if all hosts have failed and the running result is not ok 7554 1726853175.24655: done checking to see if all hosts have failed 7554 1726853175.24656: getting the remaining hosts for this loop 7554 1726853175.24658: done getting the remaining hosts for this loop 7554 1726853175.24662: getting the next task for host managed_node3 7554 1726853175.24669: done getting next task for host managed_node3 7554 1726853175.24674: ^ task is: TASK: Delete dummy interface {{ interface }} 7554 1726853175.24677: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.24683: getting variables 7554 1726853175.24685: in VariableManager get_vars() 7554 1726853175.24740: Calling all_inventory to load vars for managed_node3 7554 1726853175.24746: Calling groups_inventory to load vars for managed_node3 7554 1726853175.24749: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.24763: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.24766: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.24770: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.26397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.27879: done with get_vars() 7554 1726853175.27900: done getting variables 7554 1726853175.27962: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853175.28069: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:26:15 -0400 (0:00:00.054) 0:00:29.248 ****** 7554 1726853175.28102: entering _queue_task() for managed_node3/command 7554 1726853175.28416: worker is 1 (out of 1 available) 7554 1726853175.28428: exiting _queue_task() for managed_node3/command 7554 1726853175.28440: done queuing things up, now waiting for results queue to drain 7554 1726853175.28445: waiting for pending results... 
7554 1726853175.28735: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7554 1726853175.28895: in run() - task 02083763-bbaf-bdc3-98b6-000000000e07 7554 1726853175.28899: variable 'ansible_search_path' from source: unknown 7554 1726853175.28901: variable 'ansible_search_path' from source: unknown 7554 1726853175.28927: calling self._execute() 7554 1726853175.29031: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.29046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.29213: variable 'omit' from source: magic vars 7554 1726853175.29478: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.29495: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.29705: variable 'type' from source: play vars 7554 1726853175.29716: variable 'state' from source: include params 7554 1726853175.29729: variable 'interface' from source: play vars 7554 1726853175.29764: variable 'current_interfaces' from source: set_fact 7554 1726853175.29768: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7554 1726853175.29770: when evaluation is False, skipping this task 7554 1726853175.29774: _execute() done 7554 1726853175.29776: dumping result to json 7554 1726853175.29779: done dumping result, returning 7554 1726853175.29787: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [02083763-bbaf-bdc3-98b6-000000000e07] 7554 1726853175.29797: sending task result for task 02083763-bbaf-bdc3-98b6-000000000e07 7554 1726853175.29945: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000e07 7554 1726853175.29948: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7554 
1726853175.30033: no more pending results, returning what we have 7554 1726853175.30036: results queue empty 7554 1726853175.30037: checking for any_errors_fatal 7554 1726853175.30046: done checking for any_errors_fatal 7554 1726853175.30047: checking for max_fail_percentage 7554 1726853175.30048: done checking for max_fail_percentage 7554 1726853175.30049: checking to see if all hosts have failed and the running result is not ok 7554 1726853175.30050: done checking to see if all hosts have failed 7554 1726853175.30051: getting the remaining hosts for this loop 7554 1726853175.30052: done getting the remaining hosts for this loop 7554 1726853175.30055: getting the next task for host managed_node3 7554 1726853175.30061: done getting next task for host managed_node3 7554 1726853175.30063: ^ task is: TASK: Create tap interface {{ interface }} 7554 1726853175.30066: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.30070: getting variables 7554 1726853175.30075: in VariableManager get_vars() 7554 1726853175.30124: Calling all_inventory to load vars for managed_node3 7554 1726853175.30126: Calling groups_inventory to load vars for managed_node3 7554 1726853175.30128: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.30138: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.30140: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.30145: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.30910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.31781: done with get_vars() 7554 1726853175.31795: done getting variables 7554 1726853175.31861: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853175.31966: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:26:15 -0400 (0:00:00.038) 0:00:29.287 ****** 7554 1726853175.31997: entering _queue_task() for managed_node3/command 7554 1726853175.32287: worker is 1 (out of 1 available) 7554 1726853175.32300: exiting _queue_task() for managed_node3/command 7554 1726853175.32312: done queuing things up, now waiting for results queue to drain 7554 1726853175.32313: waiting for pending results... 
7554 1726853175.32783: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7554 1726853175.32789: in run() - task 02083763-bbaf-bdc3-98b6-000000000e08 7554 1726853175.32792: variable 'ansible_search_path' from source: unknown 7554 1726853175.32795: variable 'ansible_search_path' from source: unknown 7554 1726853175.32797: calling self._execute() 7554 1726853175.32897: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.32912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.32919: variable 'omit' from source: magic vars 7554 1726853175.33195: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.33204: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.33332: variable 'type' from source: play vars 7554 1726853175.33336: variable 'state' from source: include params 7554 1726853175.33345: variable 'interface' from source: play vars 7554 1726853175.33348: variable 'current_interfaces' from source: set_fact 7554 1726853175.33354: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7554 1726853175.33357: when evaluation is False, skipping this task 7554 1726853175.33360: _execute() done 7554 1726853175.33362: dumping result to json 7554 1726853175.33364: done dumping result, returning 7554 1726853175.33373: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [02083763-bbaf-bdc3-98b6-000000000e08] 7554 1726853175.33378: sending task result for task 02083763-bbaf-bdc3-98b6-000000000e08 7554 1726853175.33461: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000e08 7554 1726853175.33464: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7554 
1726853175.33511: no more pending results, returning what we have 7554 1726853175.33516: results queue empty 7554 1726853175.33517: checking for any_errors_fatal 7554 1726853175.33521: done checking for any_errors_fatal 7554 1726853175.33522: checking for max_fail_percentage 7554 1726853175.33524: done checking for max_fail_percentage 7554 1726853175.33524: checking to see if all hosts have failed and the running result is not ok 7554 1726853175.33525: done checking to see if all hosts have failed 7554 1726853175.33526: getting the remaining hosts for this loop 7554 1726853175.33527: done getting the remaining hosts for this loop 7554 1726853175.33531: getting the next task for host managed_node3 7554 1726853175.33536: done getting next task for host managed_node3 7554 1726853175.33539: ^ task is: TASK: Delete tap interface {{ interface }} 7554 1726853175.33544: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.33548: getting variables 7554 1726853175.33550: in VariableManager get_vars() 7554 1726853175.33593: Calling all_inventory to load vars for managed_node3 7554 1726853175.33595: Calling groups_inventory to load vars for managed_node3 7554 1726853175.33597: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.33607: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.33610: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.33612: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.38809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.40006: done with get_vars() 7554 1726853175.40025: done getting variables 7554 1726853175.40064: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853175.40131: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:26:15 -0400 (0:00:00.081) 0:00:29.369 ****** 7554 1726853175.40153: entering _queue_task() for managed_node3/command 7554 1726853175.40708: worker is 1 (out of 1 available) 7554 1726853175.40719: exiting _queue_task() for managed_node3/command 7554 1726853175.40729: done queuing things up, now waiting for results queue to drain 7554 1726853175.40731: waiting for pending results... 
7554 1726853175.40892: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7554 1726853175.41073: in run() - task 02083763-bbaf-bdc3-98b6-000000000e09 7554 1726853175.41078: variable 'ansible_search_path' from source: unknown 7554 1726853175.41081: variable 'ansible_search_path' from source: unknown 7554 1726853175.41085: calling self._execute() 7554 1726853175.41205: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.41219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.41236: variable 'omit' from source: magic vars 7554 1726853175.41789: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.41793: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.41945: variable 'type' from source: play vars 7554 1726853175.41958: variable 'state' from source: include params 7554 1726853175.41970: variable 'interface' from source: play vars 7554 1726853175.41985: variable 'current_interfaces' from source: set_fact 7554 1726853175.41999: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7554 1726853175.42006: when evaluation is False, skipping this task 7554 1726853175.42018: _execute() done 7554 1726853175.42026: dumping result to json 7554 1726853175.42033: done dumping result, returning 7554 1726853175.42045: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [02083763-bbaf-bdc3-98b6-000000000e09] 7554 1726853175.42055: sending task result for task 02083763-bbaf-bdc3-98b6-000000000e09 7554 1726853175.42310: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000e09 7554 1726853175.42314: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7554 1726853175.42367: no 
more pending results, returning what we have 7554 1726853175.42373: results queue empty 7554 1726853175.42374: checking for any_errors_fatal 7554 1726853175.42381: done checking for any_errors_fatal 7554 1726853175.42382: checking for max_fail_percentage 7554 1726853175.42383: done checking for max_fail_percentage 7554 1726853175.42384: checking to see if all hosts have failed and the running result is not ok 7554 1726853175.42386: done checking to see if all hosts have failed 7554 1726853175.42386: getting the remaining hosts for this loop 7554 1726853175.42388: done getting the remaining hosts for this loop 7554 1726853175.42391: getting the next task for host managed_node3 7554 1726853175.42400: done getting next task for host managed_node3 7554 1726853175.42403: ^ task is: TASK: TEST: I can configure an interface with auto_gateway disabled 7554 1726853175.42406: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.42412: getting variables 7554 1726853175.42414: in VariableManager get_vars() 7554 1726853175.42574: Calling all_inventory to load vars for managed_node3 7554 1726853175.42578: Calling groups_inventory to load vars for managed_node3 7554 1726853175.42581: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.42591: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.42594: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.42597: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.44035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.45620: done with get_vars() 7554 1726853175.45651: done getting variables 7554 1726853175.45717: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with auto_gateway disabled] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:83 Friday 20 September 2024 13:26:15 -0400 (0:00:00.055) 0:00:29.425 ****** 7554 1726853175.45754: entering _queue_task() for managed_node3/debug 7554 1726853175.46111: worker is 1 (out of 1 available) 7554 1726853175.46125: exiting _queue_task() for managed_node3/debug 7554 1726853175.46138: done queuing things up, now waiting for results queue to drain 7554 1726853175.46139: waiting for pending results... 
7554 1726853175.46328: running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway disabled 7554 1726853175.46398: in run() - task 02083763-bbaf-bdc3-98b6-0000000000af 7554 1726853175.46409: variable 'ansible_search_path' from source: unknown 7554 1726853175.46438: calling self._execute() 7554 1726853175.46526: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.46530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.46538: variable 'omit' from source: magic vars 7554 1726853175.46821: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.46833: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.46840: variable 'omit' from source: magic vars 7554 1726853175.46856: variable 'omit' from source: magic vars 7554 1726853175.46886: variable 'omit' from source: magic vars 7554 1726853175.46916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853175.46947: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853175.46962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853175.46982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853175.46994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853175.47021: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853175.47024: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.47027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.47103: Set connection var ansible_shell_executable to /bin/sh 
7554 1726853175.47110: Set connection var ansible_pipelining to False 7554 1726853175.47113: Set connection var ansible_shell_type to sh 7554 1726853175.47115: Set connection var ansible_connection to ssh 7554 1726853175.47123: Set connection var ansible_timeout to 10 7554 1726853175.47128: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853175.47150: variable 'ansible_shell_executable' from source: unknown 7554 1726853175.47155: variable 'ansible_connection' from source: unknown 7554 1726853175.47158: variable 'ansible_module_compression' from source: unknown 7554 1726853175.47161: variable 'ansible_shell_type' from source: unknown 7554 1726853175.47163: variable 'ansible_shell_executable' from source: unknown 7554 1726853175.47165: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.47167: variable 'ansible_pipelining' from source: unknown 7554 1726853175.47170: variable 'ansible_timeout' from source: unknown 7554 1726853175.47174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.47277: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853175.47288: variable 'omit' from source: magic vars 7554 1726853175.47292: starting attempt loop 7554 1726853175.47294: running the handler 7554 1726853175.47334: handler run complete 7554 1726853175.47351: attempt loop complete, returning result 7554 1726853175.47354: _execute() done 7554 1726853175.47356: dumping result to json 7554 1726853175.47359: done dumping result, returning 7554 1726853175.47365: done running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway disabled [02083763-bbaf-bdc3-98b6-0000000000af] 7554 1726853175.47370: 
sending task result for task 02083763-bbaf-bdc3-98b6-0000000000af 7554 1726853175.47457: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000af 7554 1726853175.47459: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 7554 1726853175.47508: no more pending results, returning what we have 7554 1726853175.47511: results queue empty 7554 1726853175.47511: checking for any_errors_fatal 7554 1726853175.47516: done checking for any_errors_fatal 7554 1726853175.47517: checking for max_fail_percentage 7554 1726853175.47518: done checking for max_fail_percentage 7554 1726853175.47519: checking to see if all hosts have failed and the running result is not ok 7554 1726853175.47520: done checking to see if all hosts have failed 7554 1726853175.47521: getting the remaining hosts for this loop 7554 1726853175.47522: done getting the remaining hosts for this loop 7554 1726853175.47526: getting the next task for host managed_node3 7554 1726853175.47531: done getting next task for host managed_node3 7554 1726853175.47534: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7554 1726853175.47537: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.47540: getting variables 7554 1726853175.47542: in VariableManager get_vars() 7554 1726853175.47596: Calling all_inventory to load vars for managed_node3 7554 1726853175.47598: Calling groups_inventory to load vars for managed_node3 7554 1726853175.47601: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.47610: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.47612: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.47615: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.48491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.49345: done with get_vars() 7554 1726853175.49360: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:87 Friday 20 September 2024 13:26:15 -0400 (0:00:00.036) 0:00:29.461 ****** 7554 1726853175.49428: entering _queue_task() for managed_node3/include_tasks 7554 1726853175.49662: worker is 1 (out of 1 available) 7554 1726853175.49677: exiting _queue_task() for managed_node3/include_tasks 7554 1726853175.49690: done queuing things up, now waiting for results queue to drain 7554 1726853175.49692: waiting for pending results... 
7554 1726853175.49879: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7554 1726853175.49949: in run() - task 02083763-bbaf-bdc3-98b6-0000000000b0 7554 1726853175.49962: variable 'ansible_search_path' from source: unknown 7554 1726853175.49992: calling self._execute() 7554 1726853175.50076: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.50083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.50092: variable 'omit' from source: magic vars 7554 1726853175.50373: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.50384: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.50389: _execute() done 7554 1726853175.50393: dumping result to json 7554 1726853175.50396: done dumping result, returning 7554 1726853175.50403: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [02083763-bbaf-bdc3-98b6-0000000000b0] 7554 1726853175.50408: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000b0 7554 1726853175.50503: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000b0 7554 1726853175.50506: WORKER PROCESS EXITING 7554 1726853175.50531: no more pending results, returning what we have 7554 1726853175.50536: in VariableManager get_vars() 7554 1726853175.50589: Calling all_inventory to load vars for managed_node3 7554 1726853175.50592: Calling groups_inventory to load vars for managed_node3 7554 1726853175.50594: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.50606: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.50608: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.50611: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.51377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7554 1726853175.52327: done with get_vars() 7554 1726853175.52340: variable 'ansible_search_path' from source: unknown 7554 1726853175.52351: we have included files to process 7554 1726853175.52351: generating all_blocks data 7554 1726853175.52353: done generating all_blocks data 7554 1726853175.52357: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853175.52358: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853175.52359: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853175.52609: in VariableManager get_vars() 7554 1726853175.52630: done with get_vars() 7554 1726853175.53043: done processing included file 7554 1726853175.53045: iterating over new_blocks loaded from include file 7554 1726853175.53046: in VariableManager get_vars() 7554 1726853175.53060: done with get_vars() 7554 1726853175.53061: filtering new block on tags 7554 1726853175.53083: done filtering new block on tags 7554 1726853175.53085: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7554 1726853175.53089: extending task lists for all hosts with included blocks 7554 1726853175.55835: done extending task lists 7554 1726853175.55836: done processing included files 7554 1726853175.55837: results queue empty 7554 1726853175.55838: checking for any_errors_fatal 7554 1726853175.55840: done checking for any_errors_fatal 7554 1726853175.55840: checking for max_fail_percentage 7554 1726853175.55841: done checking for max_fail_percentage 7554 1726853175.55842: checking to see if all hosts have failed and the running 
result is not ok 7554 1726853175.55843: done checking to see if all hosts have failed 7554 1726853175.55843: getting the remaining hosts for this loop 7554 1726853175.55844: done getting the remaining hosts for this loop 7554 1726853175.55846: getting the next task for host managed_node3 7554 1726853175.55848: done getting next task for host managed_node3 7554 1726853175.55850: ^ task is: TASK: Ensure state in ["present", "absent"] 7554 1726853175.55851: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.55853: getting variables 7554 1726853175.55853: in VariableManager get_vars() 7554 1726853175.55867: Calling all_inventory to load vars for managed_node3 7554 1726853175.55869: Calling groups_inventory to load vars for managed_node3 7554 1726853175.55870: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.55877: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.55878: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.55880: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.56520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.57467: done with get_vars() 7554 1726853175.57483: done getting variables 7554 1726853175.57513: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:26:15 -0400 (0:00:00.081) 0:00:29.542 ****** 7554 1726853175.57534: entering _queue_task() for managed_node3/fail 7554 1726853175.57799: worker is 1 (out of 1 available) 7554 1726853175.57812: exiting _queue_task() for managed_node3/fail 7554 1726853175.57825: done queuing things up, now waiting for results queue to drain 7554 1726853175.57827: waiting for pending results... 
7554 1726853175.58021: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7554 1726853175.58091: in run() - task 02083763-bbaf-bdc3-98b6-0000000010aa 7554 1726853175.58108: variable 'ansible_search_path' from source: unknown 7554 1726853175.58112: variable 'ansible_search_path' from source: unknown 7554 1726853175.58137: calling self._execute() 7554 1726853175.58221: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.58227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.58234: variable 'omit' from source: magic vars 7554 1726853175.58539: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.58551: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.58641: variable 'state' from source: include params 7554 1726853175.58650: Evaluated conditional (state not in ["present", "absent"]): False 7554 1726853175.58653: when evaluation is False, skipping this task 7554 1726853175.58657: _execute() done 7554 1726853175.58659: dumping result to json 7554 1726853175.58662: done dumping result, returning 7554 1726853175.58672: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-bdc3-98b6-0000000010aa] 7554 1726853175.58675: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010aa 7554 1726853175.58753: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010aa 7554 1726853175.58756: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7554 1726853175.58801: no more pending results, returning what we have 7554 1726853175.58804: results queue empty 7554 1726853175.58805: checking for any_errors_fatal 7554 1726853175.58806: done checking for any_errors_fatal 7554 1726853175.58807: checking for 
max_fail_percentage 7554 1726853175.58808: done checking for max_fail_percentage 7554 1726853175.58809: checking to see if all hosts have failed and the running result is not ok 7554 1726853175.58810: done checking to see if all hosts have failed 7554 1726853175.58811: getting the remaining hosts for this loop 7554 1726853175.58812: done getting the remaining hosts for this loop 7554 1726853175.58815: getting the next task for host managed_node3 7554 1726853175.58820: done getting next task for host managed_node3 7554 1726853175.58822: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7554 1726853175.58825: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.58829: getting variables 7554 1726853175.58831: in VariableManager get_vars() 7554 1726853175.58886: Calling all_inventory to load vars for managed_node3 7554 1726853175.58888: Calling groups_inventory to load vars for managed_node3 7554 1726853175.58891: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.58902: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.58904: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.58906: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.59669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.61038: done with get_vars() 7554 1726853175.61058: done getting variables 7554 1726853175.61105: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:26:15 -0400 (0:00:00.035) 0:00:29.578 ****** 7554 1726853175.61127: entering _queue_task() for managed_node3/fail 7554 1726853175.61377: worker is 1 (out of 1 available) 7554 1726853175.61390: exiting _queue_task() for managed_node3/fail 7554 1726853175.61403: done queuing things up, now waiting for results queue to drain 7554 1726853175.61405: waiting for pending results... 
7554 1726853175.61587: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7554 1726853175.61656: in run() - task 02083763-bbaf-bdc3-98b6-0000000010ab 7554 1726853175.61670: variable 'ansible_search_path' from source: unknown 7554 1726853175.61674: variable 'ansible_search_path' from source: unknown 7554 1726853175.61701: calling self._execute() 7554 1726853175.61783: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.61786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.61795: variable 'omit' from source: magic vars 7554 1726853175.62078: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.62089: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.62183: variable 'type' from source: play vars 7554 1726853175.62188: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7554 1726853175.62191: when evaluation is False, skipping this task 7554 1726853175.62193: _execute() done 7554 1726853175.62196: dumping result to json 7554 1726853175.62200: done dumping result, returning 7554 1726853175.62206: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-bdc3-98b6-0000000010ab] 7554 1726853175.62212: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010ab 7554 1726853175.62296: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010ab 7554 1726853175.62299: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7554 1726853175.62345: no more pending results, returning what we have 7554 1726853175.62348: results queue empty 7554 1726853175.62349: checking for any_errors_fatal 7554 1726853175.62359: done checking for any_errors_fatal 7554 1726853175.62359: checking for 
max_fail_percentage 7554 1726853175.62361: done checking for max_fail_percentage 7554 1726853175.62361: checking to see if all hosts have failed and the running result is not ok 7554 1726853175.62362: done checking to see if all hosts have failed 7554 1726853175.62363: getting the remaining hosts for this loop 7554 1726853175.62364: done getting the remaining hosts for this loop 7554 1726853175.62367: getting the next task for host managed_node3 7554 1726853175.62376: done getting next task for host managed_node3 7554 1726853175.62378: ^ task is: TASK: Include the task 'show_interfaces.yml' 7554 1726853175.62381: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.62385: getting variables 7554 1726853175.62387: in VariableManager get_vars() 7554 1726853175.62431: Calling all_inventory to load vars for managed_node3 7554 1726853175.62433: Calling groups_inventory to load vars for managed_node3 7554 1726853175.62435: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.62448: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.62450: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.62453: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.63953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.65591: done with get_vars() 7554 1726853175.65611: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:26:15 -0400 (0:00:00.045) 0:00:29.624 ****** 7554 1726853175.65709: entering _queue_task() for managed_node3/include_tasks 7554 1726853175.66038: worker is 1 (out of 1 available) 7554 1726853175.66052: exiting _queue_task() for managed_node3/include_tasks 7554 1726853175.66068: done queuing things up, now waiting for results queue to drain 7554 1726853175.66070: waiting for pending results... 
7554 1726853175.66494: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7554 1726853175.66510: in run() - task 02083763-bbaf-bdc3-98b6-0000000010ac 7554 1726853175.66530: variable 'ansible_search_path' from source: unknown 7554 1726853175.66538: variable 'ansible_search_path' from source: unknown 7554 1726853175.66581: calling self._execute() 7554 1726853175.66714: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.66721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.66821: variable 'omit' from source: magic vars 7554 1726853175.67136: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.67165: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.67178: _execute() done 7554 1726853175.67185: dumping result to json 7554 1726853175.67192: done dumping result, returning 7554 1726853175.67201: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-bdc3-98b6-0000000010ac] 7554 1726853175.67213: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010ac 7554 1726853175.67461: no more pending results, returning what we have 7554 1726853175.67466: in VariableManager get_vars() 7554 1726853175.67527: Calling all_inventory to load vars for managed_node3 7554 1726853175.67530: Calling groups_inventory to load vars for managed_node3 7554 1726853175.67532: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.67550: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.67553: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.67556: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.68184: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010ac 7554 1726853175.68188: WORKER PROCESS EXITING 7554 1726853175.69099: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.70692: done with get_vars() 7554 1726853175.70721: variable 'ansible_search_path' from source: unknown 7554 1726853175.70723: variable 'ansible_search_path' from source: unknown 7554 1726853175.70764: we have included files to process 7554 1726853175.70765: generating all_blocks data 7554 1726853175.70767: done generating all_blocks data 7554 1726853175.70773: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853175.70775: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853175.70778: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853175.70894: in VariableManager get_vars() 7554 1726853175.70934: done with get_vars() 7554 1726853175.71060: done processing included file 7554 1726853175.71062: iterating over new_blocks loaded from include file 7554 1726853175.71063: in VariableManager get_vars() 7554 1726853175.71090: done with get_vars() 7554 1726853175.71092: filtering new block on tags 7554 1726853175.71109: done filtering new block on tags 7554 1726853175.71112: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7554 1726853175.71117: extending task lists for all hosts with included blocks 7554 1726853175.71548: done extending task lists 7554 1726853175.71550: done processing included files 7554 1726853175.71551: results queue empty 7554 1726853175.71551: checking for any_errors_fatal 7554 1726853175.71555: done checking for any_errors_fatal 7554 1726853175.71555: checking for max_fail_percentage 7554 
1726853175.71556: done checking for max_fail_percentage 7554 1726853175.71557: checking to see if all hosts have failed and the running result is not ok 7554 1726853175.71558: done checking to see if all hosts have failed 7554 1726853175.71559: getting the remaining hosts for this loop 7554 1726853175.71560: done getting the remaining hosts for this loop 7554 1726853175.71562: getting the next task for host managed_node3 7554 1726853175.71567: done getting next task for host managed_node3 7554 1726853175.71569: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7554 1726853175.71575: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.71578: getting variables 7554 1726853175.71579: in VariableManager get_vars() 7554 1726853175.71601: Calling all_inventory to load vars for managed_node3 7554 1726853175.71604: Calling groups_inventory to load vars for managed_node3 7554 1726853175.71606: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.71612: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.71614: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.71617: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.72868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.74420: done with get_vars() 7554 1726853175.74447: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:26:15 -0400 (0:00:00.088) 0:00:29.712 ****** 7554 1726853175.74534: entering _queue_task() for managed_node3/include_tasks 7554 1726853175.74895: worker is 1 (out of 1 available) 7554 1726853175.74908: exiting _queue_task() for managed_node3/include_tasks 7554 1726853175.74920: done queuing things up, now waiting for results queue to drain 7554 1726853175.74922: waiting for pending results... 
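Each included task above passes the same gate before running: `Evaluated conditional (ansible_distribution_major_version != '6'): True`. The fact is a string, so this is a string comparison; a minimal stand-in (the fact value `"9"` is a hypothetical choice for the managed node):

```python
# Stand-in for the per-task conditional seen in the log; the fact value
# here is hypothetical, the comparison is the one the log evaluates.
facts = {"ansible_distribution_major_version": "9"}
run_task = facts["ansible_distribution_major_version"] != "6"
```

A node reporting `"6"` would make `run_task` False and the task would be skipped rather than queued.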
7554 1726853175.75234: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7554 1726853175.75378: in run() - task 02083763-bbaf-bdc3-98b6-00000000130a 7554 1726853175.75404: variable 'ansible_search_path' from source: unknown 7554 1726853175.75411: variable 'ansible_search_path' from source: unknown 7554 1726853175.75453: calling self._execute() 7554 1726853175.75561: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.75575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.75589: variable 'omit' from source: magic vars 7554 1726853175.75988: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.76005: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.76015: _execute() done 7554 1726853175.76023: dumping result to json 7554 1726853175.76052: done dumping result, returning 7554 1726853175.76055: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-bdc3-98b6-00000000130a] 7554 1726853175.76058: sending task result for task 02083763-bbaf-bdc3-98b6-00000000130a 7554 1726853175.76301: no more pending results, returning what we have 7554 1726853175.76307: in VariableManager get_vars() 7554 1726853175.76367: Calling all_inventory to load vars for managed_node3 7554 1726853175.76375: Calling groups_inventory to load vars for managed_node3 7554 1726853175.76378: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.76384: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000130a 7554 1726853175.76387: WORKER PROCESS EXITING 7554 1726853175.76400: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.76403: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.76406: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.77966: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.79574: done with get_vars() 7554 1726853175.79601: variable 'ansible_search_path' from source: unknown 7554 1726853175.79607: variable 'ansible_search_path' from source: unknown 7554 1726853175.79674: we have included files to process 7554 1726853175.79676: generating all_blocks data 7554 1726853175.79678: done generating all_blocks data 7554 1726853175.79679: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853175.79680: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853175.79682: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853175.79974: done processing included file 7554 1726853175.79977: iterating over new_blocks loaded from include file 7554 1726853175.79978: in VariableManager get_vars() 7554 1726853175.80006: done with get_vars() 7554 1726853175.80008: filtering new block on tags 7554 1726853175.80024: done filtering new block on tags 7554 1726853175.80027: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7554 1726853175.80033: extending task lists for all hosts with included blocks 7554 1726853175.80206: done extending task lists 7554 1726853175.80208: done processing included files 7554 1726853175.80209: results queue empty 7554 1726853175.80209: checking for any_errors_fatal 7554 1726853175.80213: done checking for any_errors_fatal 7554 1726853175.80214: checking for max_fail_percentage 7554 1726853175.80215: done checking for max_fail_percentage 7554 
1726853175.80216: checking to see if all hosts have failed and the running result is not ok 7554 1726853175.80217: done checking to see if all hosts have failed 7554 1726853175.80217: getting the remaining hosts for this loop 7554 1726853175.80220: done getting the remaining hosts for this loop 7554 1726853175.80222: getting the next task for host managed_node3 7554 1726853175.80227: done getting next task for host managed_node3 7554 1726853175.80229: ^ task is: TASK: Gather current interface info 7554 1726853175.80232: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853175.80234: getting variables 7554 1726853175.80235: in VariableManager get_vars() 7554 1726853175.80261: Calling all_inventory to load vars for managed_node3 7554 1726853175.80263: Calling groups_inventory to load vars for managed_node3 7554 1726853175.80266: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853175.80274: Calling all_plugins_play to load vars for managed_node3 7554 1726853175.80276: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853175.80279: Calling groups_plugins_play to load vars for managed_node3 7554 1726853175.81565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853175.83122: done with get_vars() 7554 1726853175.83152: done getting variables 7554 1726853175.83210: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:26:15 -0400 (0:00:00.087) 0:00:29.799 ****** 7554 1726853175.83248: entering _queue_task() for managed_node3/command 7554 1726853175.83715: worker is 1 (out of 1 available) 7554 1726853175.83726: exiting _queue_task() for managed_node3/command 7554 1726853175.83738: done queuing things up, now waiting for results queue to drain 7554 1726853175.83740: waiting for pending results... 
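The "Gather current interface info" task queued here runs the command module with `chdir=/sys/class/net` and `ls -1`, so the module's `stdout` carries one interface name per line. A small sketch parsing a result shaped like the one this log records for the managed node (`eth0` and `lo`):

```python
import json

# A result shaped like the command-module JSON recorded in this log;
# stdout lines are the interface names from /sys/class/net.
raw = ('{"changed": true, "stdout": "eth0\\nlo", "stderr": "", '
       '"rc": 0, "cmd": ["ls", "-1"]}')
result = json.loads(raw)
interfaces = result["stdout"].splitlines()
```

Consumers of the task typically register this result and use `stdout_lines` the same way `splitlines()` is used here.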
7554 1726853175.83966: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7554 1726853175.84076: in run() - task 02083763-bbaf-bdc3-98b6-000000001341 7554 1726853175.84096: variable 'ansible_search_path' from source: unknown 7554 1726853175.84100: variable 'ansible_search_path' from source: unknown 7554 1726853175.84129: calling self._execute() 7554 1726853175.84216: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.84226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.84231: variable 'omit' from source: magic vars 7554 1726853175.84515: variable 'ansible_distribution_major_version' from source: facts 7554 1726853175.84526: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853175.84531: variable 'omit' from source: magic vars 7554 1726853175.84575: variable 'omit' from source: magic vars 7554 1726853175.84600: variable 'omit' from source: magic vars 7554 1726853175.84634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853175.84664: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853175.84683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853175.84696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853175.84706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853175.84731: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853175.84735: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.84737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.84811: Set connection 
var ansible_shell_executable to /bin/sh 7554 1726853175.84818: Set connection var ansible_pipelining to False 7554 1726853175.84820: Set connection var ansible_shell_type to sh 7554 1726853175.84823: Set connection var ansible_connection to ssh 7554 1726853175.84830: Set connection var ansible_timeout to 10 7554 1726853175.84835: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853175.84856: variable 'ansible_shell_executable' from source: unknown 7554 1726853175.84859: variable 'ansible_connection' from source: unknown 7554 1726853175.84862: variable 'ansible_module_compression' from source: unknown 7554 1726853175.84864: variable 'ansible_shell_type' from source: unknown 7554 1726853175.84867: variable 'ansible_shell_executable' from source: unknown 7554 1726853175.84869: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853175.84873: variable 'ansible_pipelining' from source: unknown 7554 1726853175.84875: variable 'ansible_timeout' from source: unknown 7554 1726853175.84880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853175.84988: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853175.84992: variable 'omit' from source: magic vars 7554 1726853175.84998: starting attempt loop 7554 1726853175.85001: running the handler 7554 1726853175.85014: _low_level_execute_command(): starting 7554 1726853175.85022: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853175.85537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853175.85541: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853175.85545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853175.85549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853175.85600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853175.85603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853175.85606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853175.85689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853175.87379: stdout chunk (state=3): >>>/root <<< 7554 1726853175.87487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853175.87513: stderr chunk (state=3): >>><<< 7554 1726853175.87517: stdout chunk (state=3): >>><<< 7554 1726853175.87536: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853175.87550: _low_level_execute_command(): starting 7554 1726853175.87553: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510 `" && echo ansible-tmp-1726853175.8753557-8678-104296744927510="` echo /root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510 `" ) && sleep 0' 7554 1726853175.87960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853175.87976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853175.87980: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853175.88002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853175.88050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853175.88053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853175.88126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853175.90122: stdout chunk (state=3): >>>ansible-tmp-1726853175.8753557-8678-104296744927510=/root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510 <<< 7554 1726853175.90255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853175.90259: stdout chunk (state=3): >>><<< 7554 1726853175.90264: stderr chunk (state=3): >>><<< 7554 1726853175.90377: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853175.8753557-8678-104296744927510=/root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853175.90382: variable 'ansible_module_compression' from source: unknown 7554 1726853175.90384: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853175.90387: variable 'ansible_facts' from source: unknown 7554 1726853175.90439: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510/AnsiballZ_command.py 7554 1726853175.90540: Sending initial data 7554 1726853175.90543: Sent initial data (154 bytes) 7554 1726853175.91105: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853175.91163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853175.91237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853175.92907: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7554 1726853175.92909: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853175.92958: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853175.93011: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp_hay13x0 /root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510/AnsiballZ_command.py <<< 7554 1726853175.93020: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510/AnsiballZ_command.py" <<< 7554 1726853175.93067: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp_hay13x0" to remote "/root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510/AnsiballZ_command.py" <<< 7554 1726853175.93681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853175.93698: stderr chunk (state=3): >>><<< 7554 1726853175.93705: stdout chunk (state=3): >>><<< 7554 1726853175.93744: done transferring module to remote 7554 1726853175.93757: _low_level_execute_command(): starting 7554 1726853175.93760: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510/ /root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510/AnsiballZ_command.py && sleep 0' 7554 1726853175.94387: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853175.94445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853175.94469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853175.94500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853175.94591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853175.96498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853175.96501: stderr chunk (state=3): >>><<< 7554 1726853175.96503: stdout chunk (state=3): >>><<< 7554 1726853175.96518: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853175.96595: _low_level_execute_command(): starting 7554 1726853175.96599: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510/AnsiballZ_command.py && sleep 0' 7554 1726853175.97110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853175.97126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853175.97139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853175.97155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853175.97239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853175.97269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853175.97289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853175.97301: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853175.97449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853176.13525: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:26:16.130382", "end": "2024-09-20 13:26:16.133787", "delta": "0:00:00.003405", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853176.15401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853176.15405: stdout chunk (state=3): >>><<< 7554 1726853176.15408: stderr chunk (state=3): >>><<< 7554 1726853176.15410: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:26:16.130382", "end": "2024-09-20 13:26:16.133787", "delta": "0:00:00.003405", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853176.15633: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853176.15638: _low_level_execute_command(): starting 7554 1726853176.15640: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853175.8753557-8678-104296744927510/ > /dev/null 2>&1 && sleep 0' 7554 1726853176.16398: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853176.16417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 
1726853176.16436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853176.16457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853176.16478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853176.16535: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853176.16598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853176.16623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853176.16649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853176.16740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853176.18744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853176.19096: stderr chunk (state=3): >>><<< 7554 1726853176.19100: stdout chunk (state=3): >>><<< 7554 1726853176.19103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853176.19107: handler run complete 7554 1726853176.19141: Evaluated conditional (False): False 7554 1726853176.19146: attempt loop complete, returning result 7554 1726853176.19148: _execute() done 7554 1726853176.19151: dumping result to json 7554 1726853176.19153: done dumping result, returning 7554 1726853176.19155: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-bdc3-98b6-000000001341] 7554 1726853176.19157: sending task result for task 02083763-bbaf-bdc3-98b6-000000001341 7554 1726853176.19250: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001341 7554 1726853176.19254: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003405", "end": "2024-09-20 13:26:16.133787", "rc": 0, "start": "2024-09-20 13:26:16.130382" } STDOUT: eth0 lo 7554 1726853176.19331: no more pending results, returning what we have 7554 1726853176.19334: results queue empty 7554 1726853176.19335: checking for any_errors_fatal 7554 1726853176.19336: done 
checking for any_errors_fatal 7554 1726853176.19337: checking for max_fail_percentage 7554 1726853176.19339: done checking for max_fail_percentage 7554 1726853176.19339: checking to see if all hosts have failed and the running result is not ok 7554 1726853176.19340: done checking to see if all hosts have failed 7554 1726853176.19343: getting the remaining hosts for this loop 7554 1726853176.19345: done getting the remaining hosts for this loop 7554 1726853176.19348: getting the next task for host managed_node3 7554 1726853176.19355: done getting next task for host managed_node3 7554 1726853176.19357: ^ task is: TASK: Set current_interfaces 7554 1726853176.19362: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853176.19394: getting variables 7554 1726853176.19396: in VariableManager get_vars() 7554 1726853176.19440: Calling all_inventory to load vars for managed_node3 7554 1726853176.19445: Calling groups_inventory to load vars for managed_node3 7554 1726853176.19447: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853176.19456: Calling all_plugins_play to load vars for managed_node3 7554 1726853176.19459: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853176.19461: Calling groups_plugins_play to load vars for managed_node3 7554 1726853176.20801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853176.24072: done with get_vars() 7554 1726853176.24097: done getting variables 7554 1726853176.24155: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:26:16 -0400 (0:00:00.409) 0:00:30.209 ****** 7554 1726853176.24192: entering _queue_task() for managed_node3/set_fact 7554 1726853176.24534: worker is 1 (out of 1 available) 7554 1726853176.24546: exiting _queue_task() for managed_node3/set_fact 7554 1726853176.24778: done queuing things up, now waiting for results queue to drain 7554 1726853176.24781: waiting for pending results... 
7554 1726853176.24956: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7554 1726853176.25087: in run() - task 02083763-bbaf-bdc3-98b6-000000001342 7554 1726853176.25102: variable 'ansible_search_path' from source: unknown 7554 1726853176.25105: variable 'ansible_search_path' from source: unknown 7554 1726853176.25140: calling self._execute() 7554 1726853176.25250: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853176.25257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853176.25275: variable 'omit' from source: magic vars 7554 1726853176.25839: variable 'ansible_distribution_major_version' from source: facts 7554 1726853176.25843: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853176.25845: variable 'omit' from source: magic vars 7554 1726853176.25847: variable 'omit' from source: magic vars 7554 1726853176.25912: variable '_current_interfaces' from source: set_fact 7554 1726853176.25981: variable 'omit' from source: magic vars 7554 1726853176.26020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853176.26112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853176.26116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853176.26118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853176.26129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853176.26329: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853176.26333: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853176.26336: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7554 1726853176.26338: Set connection var ansible_shell_executable to /bin/sh 7554 1726853176.26340: Set connection var ansible_pipelining to False 7554 1726853176.26343: Set connection var ansible_shell_type to sh 7554 1726853176.26345: Set connection var ansible_connection to ssh 7554 1726853176.26347: Set connection var ansible_timeout to 10 7554 1726853176.26349: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853176.26351: variable 'ansible_shell_executable' from source: unknown 7554 1726853176.26354: variable 'ansible_connection' from source: unknown 7554 1726853176.26356: variable 'ansible_module_compression' from source: unknown 7554 1726853176.26358: variable 'ansible_shell_type' from source: unknown 7554 1726853176.26360: variable 'ansible_shell_executable' from source: unknown 7554 1726853176.26362: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853176.26364: variable 'ansible_pipelining' from source: unknown 7554 1726853176.26367: variable 'ansible_timeout' from source: unknown 7554 1726853176.26369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853176.26656: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853176.26659: variable 'omit' from source: magic vars 7554 1726853176.26661: starting attempt loop 7554 1726853176.26663: running the handler 7554 1726853176.26664: handler run complete 7554 1726853176.26666: attempt loop complete, returning result 7554 1726853176.26667: _execute() done 7554 1726853176.26669: dumping result to json 7554 1726853176.26677: done dumping result, returning 7554 1726853176.26680: done running TaskExecutor() for managed_node3/TASK: 
Set current_interfaces [02083763-bbaf-bdc3-98b6-000000001342] 7554 1726853176.26682: sending task result for task 02083763-bbaf-bdc3-98b6-000000001342 7554 1726853176.26737: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001342 7554 1726853176.26740: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7554 1726853176.26813: no more pending results, returning what we have 7554 1726853176.26816: results queue empty 7554 1726853176.26816: checking for any_errors_fatal 7554 1726853176.26824: done checking for any_errors_fatal 7554 1726853176.26825: checking for max_fail_percentage 7554 1726853176.26826: done checking for max_fail_percentage 7554 1726853176.26827: checking to see if all hosts have failed and the running result is not ok 7554 1726853176.26828: done checking to see if all hosts have failed 7554 1726853176.26828: getting the remaining hosts for this loop 7554 1726853176.26829: done getting the remaining hosts for this loop 7554 1726853176.26833: getting the next task for host managed_node3 7554 1726853176.26839: done getting next task for host managed_node3 7554 1726853176.26841: ^ task is: TASK: Show current_interfaces 7554 1726853176.26846: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853176.26849: getting variables 7554 1726853176.26850: in VariableManager get_vars() 7554 1726853176.26891: Calling all_inventory to load vars for managed_node3 7554 1726853176.26893: Calling groups_inventory to load vars for managed_node3 7554 1726853176.26895: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853176.26904: Calling all_plugins_play to load vars for managed_node3 7554 1726853176.26907: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853176.26909: Calling groups_plugins_play to load vars for managed_node3 7554 1726853176.28566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853176.30156: done with get_vars() 7554 1726853176.30183: done getting variables 7554 1726853176.30256: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:26:16 -0400 (0:00:00.060) 0:00:30.270 ****** 7554 1726853176.30290: entering _queue_task() for managed_node3/debug 7554 1726853176.30739: worker is 1 (out of 1 available) 7554 1726853176.30752: exiting _queue_task() for managed_node3/debug 7554 1726853176.30765: done queuing things up, now waiting for results queue to drain 7554 1726853176.30767: waiting for pending results... 
7554 1726853176.31060: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7554 1726853176.31377: in run() - task 02083763-bbaf-bdc3-98b6-00000000130b 7554 1726853176.31381: variable 'ansible_search_path' from source: unknown 7554 1726853176.31384: variable 'ansible_search_path' from source: unknown 7554 1726853176.31388: calling self._execute() 7554 1726853176.31390: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853176.31393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853176.31395: variable 'omit' from source: magic vars 7554 1726853176.31729: variable 'ansible_distribution_major_version' from source: facts 7554 1726853176.31747: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853176.31759: variable 'omit' from source: magic vars 7554 1726853176.31808: variable 'omit' from source: magic vars 7554 1726853176.31913: variable 'current_interfaces' from source: set_fact 7554 1726853176.31950: variable 'omit' from source: magic vars 7554 1726853176.31995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853176.32033: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853176.32062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853176.32086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853176.32103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853176.32136: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853176.32144: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853176.32152: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7554 1726853176.32256: Set connection var ansible_shell_executable to /bin/sh 7554 1726853176.32277: Set connection var ansible_pipelining to False 7554 1726853176.32285: Set connection var ansible_shell_type to sh 7554 1726853176.32292: Set connection var ansible_connection to ssh 7554 1726853176.32306: Set connection var ansible_timeout to 10 7554 1726853176.32315: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853176.32338: variable 'ansible_shell_executable' from source: unknown 7554 1726853176.32346: variable 'ansible_connection' from source: unknown 7554 1726853176.32376: variable 'ansible_module_compression' from source: unknown 7554 1726853176.32379: variable 'ansible_shell_type' from source: unknown 7554 1726853176.32381: variable 'ansible_shell_executable' from source: unknown 7554 1726853176.32383: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853176.32385: variable 'ansible_pipelining' from source: unknown 7554 1726853176.32387: variable 'ansible_timeout' from source: unknown 7554 1726853176.32389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853176.32544: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853176.32594: variable 'omit' from source: magic vars 7554 1726853176.32597: starting attempt loop 7554 1726853176.32599: running the handler 7554 1726853176.32629: handler run complete 7554 1726853176.32646: attempt loop complete, returning result 7554 1726853176.32653: _execute() done 7554 1726853176.32659: dumping result to json 7554 1726853176.32665: done dumping result, returning 7554 1726853176.32702: done running TaskExecutor() for managed_node3/TASK: Show 
current_interfaces [02083763-bbaf-bdc3-98b6-00000000130b] 7554 1726853176.32705: sending task result for task 02083763-bbaf-bdc3-98b6-00000000130b ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7554 1726853176.32851: no more pending results, returning what we have 7554 1726853176.32854: results queue empty 7554 1726853176.32855: checking for any_errors_fatal 7554 1726853176.32862: done checking for any_errors_fatal 7554 1726853176.32863: checking for max_fail_percentage 7554 1726853176.32864: done checking for max_fail_percentage 7554 1726853176.32865: checking to see if all hosts have failed and the running result is not ok 7554 1726853176.32866: done checking to see if all hosts have failed 7554 1726853176.32867: getting the remaining hosts for this loop 7554 1726853176.32868: done getting the remaining hosts for this loop 7554 1726853176.32873: getting the next task for host managed_node3 7554 1726853176.32882: done getting next task for host managed_node3 7554 1726853176.32885: ^ task is: TASK: Install iproute 7554 1726853176.32889: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853176.32893: getting variables 7554 1726853176.32895: in VariableManager get_vars() 7554 1726853176.32944: Calling all_inventory to load vars for managed_node3 7554 1726853176.32947: Calling groups_inventory to load vars for managed_node3 7554 1726853176.32949: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853176.32960: Calling all_plugins_play to load vars for managed_node3 7554 1726853176.32963: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853176.32966: Calling groups_plugins_play to load vars for managed_node3 7554 1726853176.33891: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000130b 7554 1726853176.33895: WORKER PROCESS EXITING 7554 1726853176.34679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853176.36173: done with get_vars() 7554 1726853176.36200: done getting variables 7554 1726853176.36262: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:26:16 -0400 (0:00:00.060) 0:00:30.330 ****** 7554 1726853176.36297: entering _queue_task() for managed_node3/package 7554 1726853176.36643: worker is 1 (out of 1 available) 7554 1726853176.36657: exiting _queue_task() for managed_node3/package 7554 1726853176.36775: done queuing things up, now waiting for results queue to drain 7554 1726853176.36778: waiting for pending results... 
7554 1726853176.36980: running TaskExecutor() for managed_node3/TASK: Install iproute 7554 1726853176.37097: in run() - task 02083763-bbaf-bdc3-98b6-0000000010ad 7554 1726853176.37124: variable 'ansible_search_path' from source: unknown 7554 1726853176.37133: variable 'ansible_search_path' from source: unknown 7554 1726853176.37180: calling self._execute() 7554 1726853176.37291: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853176.37301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853176.37312: variable 'omit' from source: magic vars 7554 1726853176.37670: variable 'ansible_distribution_major_version' from source: facts 7554 1726853176.37690: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853176.37699: variable 'omit' from source: magic vars 7554 1726853176.37738: variable 'omit' from source: magic vars 7554 1726853176.37921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853176.40022: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853176.40096: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853176.40138: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853176.40181: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853176.40212: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853176.40313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853176.40357: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853176.40392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853176.40437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853176.40456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853176.40573: variable '__network_is_ostree' from source: set_fact 7554 1726853176.40586: variable 'omit' from source: magic vars 7554 1726853176.40627: variable 'omit' from source: magic vars 7554 1726853176.40661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853176.40695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853176.40725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853176.40747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853176.40762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853176.40799: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853176.40811: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853176.40820: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 7554 1726853176.40922: Set connection var ansible_shell_executable to /bin/sh 7554 1726853176.40936: Set connection var ansible_pipelining to False 7554 1726853176.40943: Set connection var ansible_shell_type to sh 7554 1726853176.40951: Set connection var ansible_connection to ssh 7554 1726853176.40965: Set connection var ansible_timeout to 10 7554 1726853176.40978: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853176.41028: variable 'ansible_shell_executable' from source: unknown 7554 1726853176.41031: variable 'ansible_connection' from source: unknown 7554 1726853176.41033: variable 'ansible_module_compression' from source: unknown 7554 1726853176.41035: variable 'ansible_shell_type' from source: unknown 7554 1726853176.41037: variable 'ansible_shell_executable' from source: unknown 7554 1726853176.41039: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853176.41041: variable 'ansible_pipelining' from source: unknown 7554 1726853176.41043: variable 'ansible_timeout' from source: unknown 7554 1726853176.41045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853176.41151: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853176.41177: variable 'omit' from source: magic vars 7554 1726853176.41180: starting attempt loop 7554 1726853176.41247: running the handler 7554 1726853176.41250: variable 'ansible_facts' from source: unknown 7554 1726853176.41253: variable 'ansible_facts' from source: unknown 7554 1726853176.41255: _low_level_execute_command(): starting 7554 1726853176.41257: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853176.41768: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853176.41789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853176.41802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853176.41844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853176.41862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853176.41929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853176.43675: stdout chunk (state=3): >>>/root <<< 7554 1726853176.43805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853176.43809: stdout chunk (state=3): >>><<< 7554 1726853176.43811: stderr chunk (state=3): >>><<< 7554 1726853176.43845: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853176.43913: _low_level_execute_command(): starting 7554 1726853176.43917: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157 `" && echo ansible-tmp-1726853176.4386046-8706-69284515389157="` echo /root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157 `" ) && sleep 0' 7554 1726853176.44447: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853176.44451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853176.44482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853176.44487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853176.44500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853176.44539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853176.44545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853176.44611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853176.46635: stdout chunk (state=3): >>>ansible-tmp-1726853176.4386046-8706-69284515389157=/root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157 <<< 7554 1726853176.46978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853176.47178: stderr chunk (state=3): >>><<< 7554 1726853176.47181: stdout chunk (state=3): >>><<< 7554 1726853176.47184: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853176.4386046-8706-69284515389157=/root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853176.47186: variable 'ansible_module_compression' from source: unknown 7554 1726853176.47188: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7554 1726853176.47190: variable 'ansible_facts' from source: unknown 7554 1726853176.47549: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157/AnsiballZ_dnf.py 7554 1726853176.47992: Sending initial data 7554 1726853176.47996: Sent initial data (149 bytes) 7554 1726853176.50025: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853176.50191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853176.50250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853176.50297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853176.50361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853176.50420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853176.52523: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853176.52595: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpa1skrots /root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157/AnsiballZ_dnf.py <<< 7554 1726853176.52599: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157/AnsiballZ_dnf.py" <<< 7554 1726853176.52652: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpa1skrots" to remote "/root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157/AnsiballZ_dnf.py" <<< 7554 1726853176.55120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853176.55164: stderr chunk (state=3): >>><<< 7554 1726853176.55167: stdout chunk (state=3): >>><<< 7554 1726853176.55213: done transferring module to remote 7554 1726853176.55223: _low_level_execute_command(): starting 7554 1726853176.55226: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157/ /root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157/AnsiballZ_dnf.py && sleep 0' 7554 1726853176.56648: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853176.56658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853176.56660: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853176.56662: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853176.56664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853176.56796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853176.56822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853176.56895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853176.58798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853176.58802: stderr chunk (state=3): >>><<< 7554 1726853176.58804: stdout chunk (state=3): >>><<< 7554 1726853176.58836: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853176.58839: _low_level_execute_command(): starting 7554 1726853176.58842: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157/AnsiballZ_dnf.py && sleep 0' 7554 1726853176.60392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853176.60525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853176.60642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.03135: stdout chunk (state=3): >>> 
{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7554 1726853177.07475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853177.07503: stderr chunk (state=3): >>><<< 7554 1726853177.07506: stdout chunk (state=3): >>><<< 7554 1726853177.07520: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853177.07558: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853177.07566: _low_level_execute_command(): starting 7554 1726853177.07568: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853176.4386046-8706-69284515389157/ > /dev/null 2>&1 && sleep 0' 7554 1726853177.08003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.08006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.08008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853177.08010: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.08012: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.08061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853177.08065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.08127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.10002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.10023: stderr chunk (state=3): >>><<< 7554 1726853177.10027: stdout chunk (state=3): >>><<< 7554 1726853177.10038: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 7554 1726853177.10046: handler run complete 7554 1726853177.10160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853177.10287: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853177.10315: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853177.10340: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853177.10379: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853177.10427: variable '__install_status' from source: set_fact 7554 1726853177.10446: Evaluated conditional (__install_status is success): True 7554 1726853177.10456: attempt loop complete, returning result 7554 1726853177.10459: _execute() done 7554 1726853177.10461: dumping result to json 7554 1726853177.10466: done dumping result, returning 7554 1726853177.10474: done running TaskExecutor() for managed_node3/TASK: Install iproute [02083763-bbaf-bdc3-98b6-0000000010ad] 7554 1726853177.10480: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010ad 7554 1726853177.10576: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010ad 7554 1726853177.10579: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7554 1726853177.10657: no more pending results, returning what we have 7554 1726853177.10660: results queue empty 7554 1726853177.10660: checking for any_errors_fatal 7554 1726853177.10666: done checking for any_errors_fatal 7554 1726853177.10666: checking for max_fail_percentage 7554 1726853177.10668: done checking for max_fail_percentage 7554 1726853177.10669: checking to see if all hosts have failed and the running result is not ok 7554 
1726853177.10670: done checking to see if all hosts have failed 7554 1726853177.10672: getting the remaining hosts for this loop 7554 1726853177.10674: done getting the remaining hosts for this loop 7554 1726853177.10677: getting the next task for host managed_node3 7554 1726853177.10683: done getting next task for host managed_node3 7554 1726853177.10685: ^ task is: TASK: Create veth interface {{ interface }} 7554 1726853177.10688: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853177.10692: getting variables 7554 1726853177.10694: in VariableManager get_vars() 7554 1726853177.10738: Calling all_inventory to load vars for managed_node3 7554 1726853177.10740: Calling groups_inventory to load vars for managed_node3 7554 1726853177.10745: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853177.10754: Calling all_plugins_play to load vars for managed_node3 7554 1726853177.10757: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853177.10759: Calling groups_plugins_play to load vars for managed_node3 7554 1726853177.11558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853177.12406: done with get_vars() 7554 1726853177.12423: done getting variables 7554 1726853177.12466: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853177.12554: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:26:17 -0400 (0:00:00.762) 0:00:31.093 ****** 7554 1726853177.12578: entering _queue_task() for managed_node3/command 7554 1726853177.12807: worker is 1 (out of 1 available) 7554 1726853177.12821: exiting _queue_task() for managed_node3/command 7554 1726853177.12833: done queuing things up, now waiting for results queue to drain 7554 1726853177.12835: waiting for pending results... 
7554 1726853177.13020: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7554 1726853177.13090: in run() - task 02083763-bbaf-bdc3-98b6-0000000010ae 7554 1726853177.13103: variable 'ansible_search_path' from source: unknown 7554 1726853177.13107: variable 'ansible_search_path' from source: unknown 7554 1726853177.13304: variable 'interface' from source: play vars 7554 1726853177.13361: variable 'interface' from source: play vars 7554 1726853177.13415: variable 'interface' from source: play vars 7554 1726853177.13527: Loaded config def from plugin (lookup/items) 7554 1726853177.13534: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7554 1726853177.13554: variable 'omit' from source: magic vars 7554 1726853177.13650: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853177.13657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853177.13664: variable 'omit' from source: magic vars 7554 1726853177.13826: variable 'ansible_distribution_major_version' from source: facts 7554 1726853177.13829: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853177.13959: variable 'type' from source: play vars 7554 1726853177.13963: variable 'state' from source: include params 7554 1726853177.13965: variable 'interface' from source: play vars 7554 1726853177.13968: variable 'current_interfaces' from source: set_fact 7554 1726853177.13977: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7554 1726853177.13982: variable 'omit' from source: magic vars 7554 1726853177.14007: variable 'omit' from source: magic vars 7554 1726853177.14036: variable 'item' from source: unknown 7554 1726853177.14086: variable 'item' from source: unknown 7554 1726853177.14101: variable 'omit' from source: magic vars 7554 1726853177.14122: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853177.14148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853177.14163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853177.14178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853177.14186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853177.14210: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853177.14213: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853177.14215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853177.14287: Set connection var ansible_shell_executable to /bin/sh 7554 1726853177.14293: Set connection var ansible_pipelining to False 7554 1726853177.14296: Set connection var ansible_shell_type to sh 7554 1726853177.14299: Set connection var ansible_connection to ssh 7554 1726853177.14306: Set connection var ansible_timeout to 10 7554 1726853177.14311: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853177.14325: variable 'ansible_shell_executable' from source: unknown 7554 1726853177.14328: variable 'ansible_connection' from source: unknown 7554 1726853177.14330: variable 'ansible_module_compression' from source: unknown 7554 1726853177.14332: variable 'ansible_shell_type' from source: unknown 7554 1726853177.14335: variable 'ansible_shell_executable' from source: unknown 7554 1726853177.14337: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853177.14343: variable 'ansible_pipelining' from source: unknown 7554 1726853177.14346: variable 'ansible_timeout' from source: unknown 7554 
1726853177.14348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853177.14440: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853177.14449: variable 'omit' from source: magic vars 7554 1726853177.14454: starting attempt loop 7554 1726853177.14456: running the handler 7554 1726853177.14469: _low_level_execute_command(): starting 7554 1726853177.14478: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853177.14961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.14997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853177.15001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.15003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.15006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853177.15008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.15053: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853177.15057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853177.15079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.15134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.16844: stdout chunk (state=3): >>>/root <<< 7554 1726853177.16939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.16968: stderr chunk (state=3): >>><<< 7554 1726853177.16973: stdout chunk (state=3): >>><<< 7554 1726853177.16994: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.17006: _low_level_execute_command(): starting 7554 
1726853177.17012: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478 `" && echo ansible-tmp-1726853177.1699398-8743-200392243402478="` echo /root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478 `" ) && sleep 0' 7554 1726853177.17458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.17461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.17463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853177.17465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853177.17467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.17514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853177.17517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853177.17521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.17583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 
1726853177.19554: stdout chunk (state=3): >>>ansible-tmp-1726853177.1699398-8743-200392243402478=/root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478 <<< 7554 1726853177.19662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.19689: stderr chunk (state=3): >>><<< 7554 1726853177.19693: stdout chunk (state=3): >>><<< 7554 1726853177.19710: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853177.1699398-8743-200392243402478=/root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.19736: variable 'ansible_module_compression' from source: unknown 7554 1726853177.19775: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 
1726853177.19805: variable 'ansible_facts' from source: unknown 7554 1726853177.19864: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478/AnsiballZ_command.py 7554 1726853177.19964: Sending initial data 7554 1726853177.19967: Sent initial data (154 bytes) 7554 1726853177.20407: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.20410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853177.20412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.20414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.20416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853177.20419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.20466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853177.20469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.20529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.22157: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 7554 1726853177.22164: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853177.22220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853177.22277: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpj68q2wsl /root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478/AnsiballZ_command.py <<< 7554 1726853177.22284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478/AnsiballZ_command.py" <<< 7554 1726853177.22336: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpj68q2wsl" to remote "/root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478/AnsiballZ_command.py" <<< 7554 1726853177.22339: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478/AnsiballZ_command.py" <<< 7554 1726853177.22921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.22963: stderr chunk (state=3): >>><<< 7554 1726853177.22966: stdout chunk (state=3): >>><<< 7554 1726853177.23002: done transferring 
module to remote 7554 1726853177.23011: _low_level_execute_command(): starting 7554 1726853177.23016: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478/ /root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478/AnsiballZ_command.py && sleep 0' 7554 1726853177.23447: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.23450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853177.23452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.23455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.23457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.23514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853177.23517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.23574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.25409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
7554 1726853177.25432: stderr chunk (state=3): >>><<< 7554 1726853177.25435: stdout chunk (state=3): >>><<< 7554 1726853177.25452: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.25455: _low_level_execute_command(): starting 7554 1726853177.25459: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478/AnsiballZ_command.py && sleep 0' 7554 1726853177.25859: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.25894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853177.25899: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853177.25901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.25904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.25906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.25947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853177.25950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.26027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.42549: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 13:26:17.416034", "end": "2024-09-20 13:26:17.422676", "delta": "0:00:00.006642", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853177.44938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853177.44968: stderr chunk (state=3): >>><<< 7554 1726853177.44973: stdout chunk (state=3): >>><<< 7554 1726853177.44992: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 13:26:17.416034", "end": "2024-09-20 13:26:17.422676", "delta": "0:00:00.006642", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853177.45024: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853177.45032: _low_level_execute_command(): starting 7554 1726853177.45035: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853177.1699398-8743-200392243402478/ > /dev/null 2>&1 && sleep 0' 7554 1726853177.45467: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.45478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853177.45501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.45504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.45506: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.45560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853177.45567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.45632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.48760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.48788: stderr chunk (state=3): >>><<< 7554 1726853177.48791: stdout chunk (state=3): >>><<< 7554 1726853177.48805: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.48811: handler run complete 7554 1726853177.48828: Evaluated conditional (False): False 7554 1726853177.48838: attempt loop complete, returning result 7554 1726853177.48854: variable 'item' from source: unknown 7554 1726853177.48915: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.006642", "end": "2024-09-20 13:26:17.422676", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-20 13:26:17.416034" } 7554 1726853177.49101: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853177.49105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853177.49107: variable 'omit' from source: magic vars 7554 1726853177.49162: variable 'ansible_distribution_major_version' from source: facts 7554 1726853177.49167: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853177.49282: variable 'type' from source: play vars 7554 1726853177.49286: variable 'state' from source: include params 7554 1726853177.49288: variable 'interface' from source: play vars 7554 1726853177.49293: variable 'current_interfaces' from source: set_fact 7554 1726853177.49298: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7554 1726853177.49303: variable 'omit' from source: magic vars 7554 1726853177.49314: variable 'omit' from source: magic vars 7554 1726853177.49347: variable 'item' from source: unknown 7554 1726853177.49388: variable 'item' from source: unknown 7554 1726853177.49399: variable 'omit' from source: magic vars 7554 1726853177.49416: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853177.49424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853177.49430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853177.49445: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853177.49448: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853177.49450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853177.49498: Set connection var ansible_shell_executable to /bin/sh 7554 1726853177.49505: Set connection var ansible_pipelining to False 7554 1726853177.49507: Set connection var ansible_shell_type to sh 7554 1726853177.49510: Set connection var ansible_connection to ssh 7554 1726853177.49516: Set connection var ansible_timeout to 10 7554 1726853177.49521: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853177.49535: variable 'ansible_shell_executable' from source: unknown 7554 1726853177.49538: variable 'ansible_connection' from source: unknown 7554 1726853177.49540: variable 'ansible_module_compression' from source: unknown 7554 1726853177.49547: variable 'ansible_shell_type' from source: unknown 7554 1726853177.49549: variable 'ansible_shell_executable' from source: unknown 7554 1726853177.49551: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853177.49553: variable 'ansible_pipelining' from source: unknown 7554 1726853177.49556: variable 'ansible_timeout' from source: unknown 7554 1726853177.49557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853177.49622: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853177.49630: variable 'omit' from source: magic vars 7554 1726853177.49633: starting attempt loop 7554 1726853177.49635: running the handler 7554 1726853177.49645: _low_level_execute_command(): starting 7554 1726853177.49648: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853177.50108: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.50112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853177.50118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853177.50120: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.50123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.50169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853177.50178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853177.50180: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.50245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.51920: stdout chunk (state=3): >>>/root <<< 7554 1726853177.52014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.52042: stderr chunk (state=3): >>><<< 7554 1726853177.52046: stdout chunk (state=3): >>><<< 7554 1726853177.52065: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.52074: _low_level_execute_command(): starting 7554 1726853177.52080: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481 `" && 
echo ansible-tmp-1726853177.5206504-8743-49365889679481="` echo /root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481 `" ) && sleep 0' 7554 1726853177.52520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.52524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.52526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853177.52528: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.52530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.52577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853177.52581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.52648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.54593: stdout chunk (state=3): >>>ansible-tmp-1726853177.5206504-8743-49365889679481=/root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481 <<< 7554 1726853177.54701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 
1726853177.54726: stderr chunk (state=3): >>><<< 7554 1726853177.54729: stdout chunk (state=3): >>><<< 7554 1726853177.54746: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853177.5206504-8743-49365889679481=/root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.54766: variable 'ansible_module_compression' from source: unknown 7554 1726853177.54797: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853177.54812: variable 'ansible_facts' from source: unknown 7554 1726853177.54858: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481/AnsiballZ_command.py 7554 1726853177.54950: Sending initial data 7554 1726853177.54953: Sent 
initial data (153 bytes) 7554 1726853177.55372: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.55380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853177.55407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.55410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853177.55414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.55464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853177.55472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.55533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.57158: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 
1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853177.57225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853177.57422: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpji59tig_ /root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481/AnsiballZ_command.py <<< 7554 1726853177.57445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481/AnsiballZ_command.py" <<< 7554 1726853177.57515: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpji59tig_" to remote "/root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481/AnsiballZ_command.py" <<< 7554 1726853177.58534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.58657: stderr chunk (state=3): >>><<< 7554 1726853177.58660: stdout chunk (state=3): >>><<< 7554 1726853177.58662: done transferring module to remote 7554 1726853177.58664: _low_level_execute_command(): starting 7554 1726853177.58666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481/ /root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481/AnsiballZ_command.py && sleep 0' 7554 1726853177.59324: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 
1726853177.59435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853177.59461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.59563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.61450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.61476: stderr chunk (state=3): >>><<< 7554 1726853177.61481: stdout chunk (state=3): >>><<< 7554 1726853177.61498: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.61501: _low_level_execute_command(): starting 7554 1726853177.61506: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481/AnsiballZ_command.py && sleep 0' 7554 1726853177.62191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.62355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.78311: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 13:26:17.777600", "end": "2024-09-20 13:26:17.781655", "delta": "0:00:00.004055", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853177.79912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853177.79936: stderr chunk (state=3): >>><<< 7554 1726853177.79939: stdout chunk (state=3): >>><<< 7554 1726853177.79958: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 13:26:17.777600", "end": "2024-09-20 13:26:17.781655", "delta": "0:00:00.004055", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853177.79986: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853177.79990: _low_level_execute_command(): starting 7554 1726853177.79996: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853177.5206504-8743-49365889679481/ > /dev/null 2>&1 && sleep 0' 7554 1726853177.80433: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.80437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853177.80439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.80443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.80446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.80494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853177.80501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.80558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.82470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.82475: stdout chunk (state=3): >>><<< 7554 1726853177.82478: stderr chunk (state=3): >>><<< 7554 1726853177.82548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.82551: handler run complete 7554 1726853177.82553: Evaluated conditional (False): False 7554 1726853177.82555: attempt loop complete, returning result 7554 1726853177.82576: variable 'item' from source: unknown 7554 1726853177.82670: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.004055", "end": "2024-09-20 13:26:17.781655", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 13:26:17.777600" } 7554 1726853177.82792: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853177.82796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853177.82798: variable 'omit' from source: magic vars 7554 1726853177.82952: variable 'ansible_distribution_major_version' from source: facts 7554 1726853177.82955: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853177.83075: variable 'type' 
from source: play vars 7554 1726853177.83079: variable 'state' from source: include params 7554 1726853177.83084: variable 'interface' from source: play vars 7554 1726853177.83088: variable 'current_interfaces' from source: set_fact 7554 1726853177.83094: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7554 1726853177.83098: variable 'omit' from source: magic vars 7554 1726853177.83110: variable 'omit' from source: magic vars 7554 1726853177.83138: variable 'item' from source: unknown 7554 1726853177.83183: variable 'item' from source: unknown 7554 1726853177.83195: variable 'omit' from source: magic vars 7554 1726853177.83211: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853177.83217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853177.83224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853177.83233: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853177.83236: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853177.83240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853177.83292: Set connection var ansible_shell_executable to /bin/sh 7554 1726853177.83297: Set connection var ansible_pipelining to False 7554 1726853177.83300: Set connection var ansible_shell_type to sh 7554 1726853177.83302: Set connection var ansible_connection to ssh 7554 1726853177.83309: Set connection var ansible_timeout to 10 7554 1726853177.83314: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853177.83329: variable 'ansible_shell_executable' from source: unknown 7554 1726853177.83331: variable 
'ansible_connection' from source: unknown 7554 1726853177.83334: variable 'ansible_module_compression' from source: unknown 7554 1726853177.83336: variable 'ansible_shell_type' from source: unknown 7554 1726853177.83338: variable 'ansible_shell_executable' from source: unknown 7554 1726853177.83340: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853177.83342: variable 'ansible_pipelining' from source: unknown 7554 1726853177.83350: variable 'ansible_timeout' from source: unknown 7554 1726853177.83353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853177.83414: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853177.83422: variable 'omit' from source: magic vars 7554 1726853177.83425: starting attempt loop 7554 1726853177.83427: running the handler 7554 1726853177.83433: _low_level_execute_command(): starting 7554 1726853177.83435: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853177.83849: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.83852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853177.83854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.83856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.83859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.83909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853177.83914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.83975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.85658: stdout chunk (state=3): >>>/root <<< 7554 1726853177.85795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.85799: stdout chunk (state=3): >>><<< 7554 1726853177.85801: stderr chunk (state=3): >>><<< 7554 1726853177.85816: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.85831: _low_level_execute_command(): starting 7554 1726853177.85904: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068 `" && echo ansible-tmp-1726853177.8582177-8743-97817766603068="` echo /root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068 `" ) && sleep 0' 7554 1726853177.86420: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853177.86434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.86450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.86467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853177.86486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853177.86497: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853177.86597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853177.86619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.86699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.88680: stdout chunk (state=3): >>>ansible-tmp-1726853177.8582177-8743-97817766603068=/root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068 <<< 7554 1726853177.88826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.88837: stdout chunk (state=3): >>><<< 7554 1726853177.88849: stderr chunk (state=3): >>><<< 7554 1726853177.88878: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853177.8582177-8743-97817766603068=/root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.89076: variable 'ansible_module_compression' from source: unknown 7554 1726853177.89080: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853177.89082: variable 'ansible_facts' from source: unknown 7554 1726853177.89084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068/AnsiballZ_command.py 7554 1726853177.89219: Sending initial data 7554 1726853177.89229: Sent initial data (153 bytes) 7554 1726853177.89798: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853177.89811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.89824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.89843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853177.89869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853177.89953: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' <<< 7554 1726853177.89982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853177.89997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.90098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.91727: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7554 1726853177.91757: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853177.91811: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853177.91886: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpp0s23n15 /root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068/AnsiballZ_command.py <<< 7554 1726853177.91910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068/AnsiballZ_command.py" <<< 7554 1726853177.91964: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpp0s23n15" to remote "/root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068/AnsiballZ_command.py" <<< 7554 1726853177.92856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.92859: stdout chunk (state=3): >>><<< 7554 1726853177.92861: stderr chunk (state=3): >>><<< 7554 1726853177.92869: done transferring module to remote 7554 1726853177.92883: _low_level_execute_command(): starting 7554 1726853177.92891: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068/ /root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068/AnsiballZ_command.py && sleep 0' 7554 1726853177.93536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853177.93553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.93567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.93592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853177.93699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.93724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853177.93740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.93830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853177.95689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853177.95740: stderr chunk (state=3): >>><<< 7554 1726853177.95846: stdout chunk (state=3): >>><<< 7554 1726853177.95850: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853177.95852: _low_level_execute_command(): starting 7554 1726853177.95854: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068/AnsiballZ_command.py && sleep 0' 7554 1726853177.96407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853177.96425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853177.96439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853177.96487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853177.96573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853177.96622: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 7554 1726853177.96659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853177.96754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853178.13206: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 13:26:18.123372", "end": "2024-09-20 13:26:18.127233", "delta": "0:00:00.003861", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853178.14629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853178.14633: stderr chunk (state=3): >>><<< 7554 1726853178.14658: stdout chunk (state=3): >>><<< 7554 1726853178.14679: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 13:26:18.123372", "end": "2024-09-20 13:26:18.127233", "delta": "0:00:00.003861", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853178.14848: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853178.14851: _low_level_execute_command(): starting 7554 1726853178.14854: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853177.8582177-8743-97817766603068/ > /dev/null 2>&1 && sleep 0' 7554 1726853178.16094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853178.16221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853178.16250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853178.16267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853178.16384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853178.18317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853178.18329: stdout chunk (state=3): >>><<< 7554 1726853178.18342: stderr chunk (state=3): >>><<< 7554 1726853178.18582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853178.18586: handler run complete 7554 1726853178.18588: Evaluated conditional (False): False 7554 1726853178.18590: attempt loop complete, returning result 7554 1726853178.18593: variable 'item' from source: unknown 7554 1726853178.18647: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.003861", "end": "2024-09-20 13:26:18.127233", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 13:26:18.123372" } 7554 1726853178.19000: dumping result to json 7554 1726853178.19004: done dumping result, returning 7554 1726853178.19075: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [02083763-bbaf-bdc3-98b6-0000000010ae] 7554 1726853178.19080: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010ae 7554 1726853178.19175: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010ae 7554 1726853178.19178: WORKER PROCESS EXITING 7554 1726853178.19256: no more pending results, returning what we have 7554 1726853178.19265: results queue empty 7554 1726853178.19266: checking for any_errors_fatal 7554 1726853178.19275: 
done checking for any_errors_fatal 7554 1726853178.19276: checking for max_fail_percentage 7554 1726853178.19277: done checking for max_fail_percentage 7554 1726853178.19278: checking to see if all hosts have failed and the running result is not ok 7554 1726853178.19280: done checking to see if all hosts have failed 7554 1726853178.19280: getting the remaining hosts for this loop 7554 1726853178.19282: done getting the remaining hosts for this loop 7554 1726853178.19285: getting the next task for host managed_node3 7554 1726853178.19292: done getting next task for host managed_node3 7554 1726853178.19295: ^ task is: TASK: Set up veth as managed by NetworkManager 7554 1726853178.19298: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853178.19303: getting variables 7554 1726853178.19305: in VariableManager get_vars() 7554 1726853178.19355: Calling all_inventory to load vars for managed_node3 7554 1726853178.19358: Calling groups_inventory to load vars for managed_node3 7554 1726853178.19361: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853178.19678: Calling all_plugins_play to load vars for managed_node3 7554 1726853178.19683: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853178.19687: Calling groups_plugins_play to load vars for managed_node3 7554 1726853178.22795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853178.25818: done with get_vars() 7554 1726853178.25853: done getting variables 7554 1726853178.26118: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:26:18 -0400 (0:00:01.135) 0:00:32.229 ****** 7554 1726853178.26148: entering _queue_task() for managed_node3/command 7554 1726853178.26906: worker is 1 (out of 1 available) 7554 1726853178.26916: exiting _queue_task() for managed_node3/command 7554 1726853178.26926: done queuing things up, now waiting for results queue to drain 7554 1726853178.26928: waiting for pending results... 
7554 1726853178.27490: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7554 1726853178.27495: in run() - task 02083763-bbaf-bdc3-98b6-0000000010af 7554 1726853178.27510: variable 'ansible_search_path' from source: unknown 7554 1726853178.27517: variable 'ansible_search_path' from source: unknown 7554 1726853178.27559: calling self._execute() 7554 1726853178.27878: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853178.27881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853178.27891: variable 'omit' from source: magic vars 7554 1726853178.28463: variable 'ansible_distribution_major_version' from source: facts 7554 1726853178.28877: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853178.28880: variable 'type' from source: play vars 7554 1726853178.28882: variable 'state' from source: include params 7554 1726853178.28885: Evaluated conditional (type == 'veth' and state == 'present'): True 7554 1726853178.28888: variable 'omit' from source: magic vars 7554 1726853178.29103: variable 'omit' from source: magic vars 7554 1726853178.29476: variable 'interface' from source: play vars 7554 1726853178.29480: variable 'omit' from source: magic vars 7554 1726853178.29521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853178.29576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853178.29615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853178.29649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853178.29990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853178.29993: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853178.29996: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853178.29998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853178.30072: Set connection var ansible_shell_executable to /bin/sh 7554 1726853178.30090: Set connection var ansible_pipelining to False 7554 1726853178.30097: Set connection var ansible_shell_type to sh 7554 1726853178.30104: Set connection var ansible_connection to ssh 7554 1726853178.30119: Set connection var ansible_timeout to 10 7554 1726853178.30129: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853178.30160: variable 'ansible_shell_executable' from source: unknown 7554 1726853178.30482: variable 'ansible_connection' from source: unknown 7554 1726853178.30485: variable 'ansible_module_compression' from source: unknown 7554 1726853178.30488: variable 'ansible_shell_type' from source: unknown 7554 1726853178.30490: variable 'ansible_shell_executable' from source: unknown 7554 1726853178.30492: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853178.30494: variable 'ansible_pipelining' from source: unknown 7554 1726853178.30496: variable 'ansible_timeout' from source: unknown 7554 1726853178.30498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853178.30563: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853178.30876: variable 'omit' from source: magic vars 7554 1726853178.30879: starting attempt loop 7554 1726853178.30882: running the handler 7554 1726853178.30884: _low_level_execute_command(): starting 7554 1726853178.30886: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853178.32096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853178.32200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853178.32259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853178.32358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853178.34075: stdout chunk (state=3): >>>/root <<< 7554 1726853178.34189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853178.34304: stderr chunk (state=3): >>><<< 7554 1726853178.34308: stdout chunk (state=3): >>><<< 7554 1726853178.34334: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853178.34376: _low_level_execute_command(): starting 7554 1726853178.34388: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122 `" && echo ansible-tmp-1726853178.3434267-8787-231536393919122="` echo /root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122 `" ) && sleep 0' 7554 1726853178.35588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853178.35690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853178.35763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853178.35880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853178.37897: stdout chunk (state=3): >>>ansible-tmp-1726853178.3434267-8787-231536393919122=/root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122 <<< 7554 1726853178.38217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853178.38396: stdout chunk (state=3): >>><<< 7554 1726853178.38399: stderr chunk (state=3): >>><<< 7554 1726853178.38468: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853178.3434267-8787-231536393919122=/root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853178.38474: variable 'ansible_module_compression' from source: unknown 7554 1726853178.38607: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853178.38720: variable 'ansible_facts' from source: unknown 7554 1726853178.38917: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122/AnsiballZ_command.py 7554 1726853178.39298: Sending initial data 7554 1726853178.39302: Sent initial data (154 bytes) 7554 1726853178.40350: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853178.40363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853178.40380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853178.40644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853178.40786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853178.40879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853178.42545: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853178.42674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853178.42734: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp4cqk3gso /root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122/AnsiballZ_command.py <<< 7554 1726853178.42744: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122/AnsiballZ_command.py" <<< 7554 1726853178.42788: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp4cqk3gso" to remote "/root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122/AnsiballZ_command.py" <<< 7554 1726853178.44147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853178.44163: stdout chunk (state=3): >>><<< 7554 1726853178.44179: stderr chunk (state=3): >>><<< 7554 1726853178.44263: done transferring module to remote 7554 1726853178.44356: _low_level_execute_command(): starting 7554 1726853178.44361: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122/ /root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122/AnsiballZ_command.py && sleep 0' 7554 1726853178.44934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853178.44954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853178.44973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853178.45047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853178.45096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853178.45120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853178.45137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853178.45232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853178.47161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853178.47170: stdout chunk (state=3): >>><<< 7554 1726853178.47306: stderr chunk (state=3): >>><<< 7554 1726853178.47309: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853178.47311: _low_level_execute_command(): starting 7554 1726853178.47314: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122/AnsiballZ_command.py && sleep 0' 7554 1726853178.47800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853178.47815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853178.47832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853178.47851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853178.47868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853178.47884: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853178.47898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853178.47986: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853178.48004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853178.48021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853178.48119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853178.65901: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 13:26:18.639300", "end": "2024-09-20 13:26:18.657542", "delta": "0:00:00.018242", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853178.67557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853178.67604: stderr chunk (state=3): >>><<< 7554 1726853178.67613: stdout chunk (state=3): >>><<< 7554 1726853178.67637: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 13:26:18.639300", "end": "2024-09-20 13:26:18.657542", "delta": "0:00:00.018242", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
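The module result assembled above (`cmd: nmcli d set veth0 managed true`, `rc: 0`) comes from a plain `command` task. A minimal reconstruction of that task follows; the actual playbook source is not included in this log, so the YAML below is an assumption inferred only from the logged command, module arguments, and the later "Evaluated conditional (False): False" line:

```yaml
# Hypothetical reconstruction of the task whose result is logged above.
# Only the command string and the changed_when behavior are taken from
# the log; the task name matches the TASK line that follows in the trace.
- name: Set up veth as managed by NetworkManager
  ansible.builtin.command:
    cmd: nmcli d set veth0 managed true
  changed_when: false  # log shows "Evaluated conditional (False): False"
```

This also explains an apparent contradiction in the trace: the raw module JSON reports `"changed": true`, while the final `ok: [managed_node3]` result reports `"changed": false`. A `changed_when: false` (or equivalent conditional) on the task overrides the module's own changed status after the handler runs, which is exactly the order of events the log records.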
7554 1726853178.67699: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853178.67725: _low_level_execute_command(): starting 7554 1726853178.67735: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853178.3434267-8787-231536393919122/ > /dev/null 2>&1 && sleep 0' 7554 1726853178.68318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853178.68345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853178.68376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853178.68403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853178.68417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853178.68429: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853178.68455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853178.68491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853178.68535: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853178.68590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853178.68619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853178.68689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853178.70596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853178.70617: stderr chunk (state=3): >>><<< 7554 1726853178.70621: stdout chunk (state=3): >>><<< 7554 1726853178.70634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853178.70646: handler run complete 7554 1726853178.70676: Evaluated conditional (False): False 7554 1726853178.70687: attempt loop complete, returning result 7554 1726853178.70690: _execute() done 7554 1726853178.70692: dumping result to json 7554 1726853178.70694: done dumping result, returning 7554 1726853178.70701: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-bdc3-98b6-0000000010af] 7554 1726853178.70706: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010af 7554 1726853178.70801: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010af 7554 1726853178.70804: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.018242", "end": "2024-09-20 13:26:18.657542", "rc": 0, "start": "2024-09-20 13:26:18.639300" } 7554 1726853178.70865: no more pending results, returning what we have 7554 1726853178.70868: results queue empty 7554 1726853178.70869: checking for any_errors_fatal 7554 1726853178.70896: done checking for any_errors_fatal 7554 1726853178.70897: checking for max_fail_percentage 7554 1726853178.70899: done checking for max_fail_percentage 7554 1726853178.70900: checking to see if all hosts have failed and the running result is not ok 7554 1726853178.70901: done checking to see if all hosts have failed 7554 1726853178.70902: getting the remaining hosts for this loop 7554 1726853178.70903: done getting the remaining hosts for this loop 7554 1726853178.70906: getting the next task for host managed_node3 7554 1726853178.70914: done getting next task for host managed_node3 7554 
1726853178.70916: ^ task is: TASK: Delete veth interface {{ interface }} 7554 1726853178.70919: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853178.70924: getting variables 7554 1726853178.70926: in VariableManager get_vars() 7554 1726853178.71122: Calling all_inventory to load vars for managed_node3 7554 1726853178.71125: Calling groups_inventory to load vars for managed_node3 7554 1726853178.71128: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853178.71138: Calling all_plugins_play to load vars for managed_node3 7554 1726853178.71144: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853178.71148: Calling groups_plugins_play to load vars for managed_node3 7554 1726853178.72368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853178.73325: done with get_vars() 7554 1726853178.73341: done getting variables 7554 1726853178.73386: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853178.73470: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 13:26:18 -0400 (0:00:00.473) 0:00:32.702 ****** 7554 1726853178.73494: entering _queue_task() for managed_node3/command 7554 1726853178.73718: worker is 1 (out of 1 available) 7554 1726853178.73732: exiting _queue_task() for managed_node3/command 7554 1726853178.73745: done queuing things up, now waiting for results queue to drain 7554 1726853178.73746: waiting for pending results... 7554 1726853178.73927: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7554 1726853178.74018: in run() - task 02083763-bbaf-bdc3-98b6-0000000010b0 7554 1726853178.74023: variable 'ansible_search_path' from source: unknown 7554 1726853178.74026: variable 'ansible_search_path' from source: unknown 7554 1726853178.74061: calling self._execute() 7554 1726853178.74181: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853178.74184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853178.74188: variable 'omit' from source: magic vars 7554 1726853178.74676: variable 'ansible_distribution_major_version' from source: facts 7554 1726853178.74679: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853178.74759: variable 'type' from source: play vars 7554 1726853178.74769: variable 'state' from source: include params 7554 1726853178.74781: variable 'interface' from source: play vars 7554 1726853178.74789: variable 'current_interfaces' from source: set_fact 7554 1726853178.74807: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 7554 1726853178.74815: when evaluation is False, skipping this task 7554 1726853178.74828: _execute() done 7554 1726853178.74834: dumping result to json 7554 1726853178.74846: done dumping result, returning 7554 1726853178.74857: done running 
TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [02083763-bbaf-bdc3-98b6-0000000010b0] 7554 1726853178.74862: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010b0 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7554 1726853178.75003: no more pending results, returning what we have 7554 1726853178.75007: results queue empty 7554 1726853178.75007: checking for any_errors_fatal 7554 1726853178.75016: done checking for any_errors_fatal 7554 1726853178.75017: checking for max_fail_percentage 7554 1726853178.75019: done checking for max_fail_percentage 7554 1726853178.75020: checking to see if all hosts have failed and the running result is not ok 7554 1726853178.75022: done checking to see if all hosts have failed 7554 1726853178.75022: getting the remaining hosts for this loop 7554 1726853178.75024: done getting the remaining hosts for this loop 7554 1726853178.75028: getting the next task for host managed_node3 7554 1726853178.75033: done getting next task for host managed_node3 7554 1726853178.75035: ^ task is: TASK: Create dummy interface {{ interface }} 7554 1726853178.75039: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853178.75045: getting variables 7554 1726853178.75047: in VariableManager get_vars() 7554 1726853178.75106: Calling all_inventory to load vars for managed_node3 7554 1726853178.75108: Calling groups_inventory to load vars for managed_node3 7554 1726853178.75110: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853178.75116: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010b0 7554 1726853178.75119: WORKER PROCESS EXITING 7554 1726853178.75127: Calling all_plugins_play to load vars for managed_node3 7554 1726853178.75129: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853178.75133: Calling groups_plugins_play to load vars for managed_node3 7554 1726853178.76328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853178.77191: done with get_vars() 7554 1726853178.77206: done getting variables 7554 1726853178.77247: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853178.77321: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:26:18 -0400 (0:00:00.038) 0:00:32.740 ****** 7554 1726853178.77345: entering _queue_task() for managed_node3/command 7554 1726853178.77548: worker is 1 (out of 1 available) 7554 1726853178.77561: exiting _queue_task() for managed_node3/command 7554 1726853178.77576: done queuing things up, now waiting for results queue to drain 7554 1726853178.77578: waiting for pending results... 
7554 1726853178.77746: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7554 1726853178.77817: in run() - task 02083763-bbaf-bdc3-98b6-0000000010b1 7554 1726853178.77830: variable 'ansible_search_path' from source: unknown 7554 1726853178.77833: variable 'ansible_search_path' from source: unknown 7554 1726853178.77861: calling self._execute() 7554 1726853178.77932: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853178.77936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853178.78028: variable 'omit' from source: magic vars 7554 1726853178.78197: variable 'ansible_distribution_major_version' from source: facts 7554 1726853178.78212: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853178.78576: variable 'type' from source: play vars 7554 1726853178.78579: variable 'state' from source: include params 7554 1726853178.78582: variable 'interface' from source: play vars 7554 1726853178.78584: variable 'current_interfaces' from source: set_fact 7554 1726853178.78586: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7554 1726853178.78588: when evaluation is False, skipping this task 7554 1726853178.78591: _execute() done 7554 1726853178.78592: dumping result to json 7554 1726853178.78594: done dumping result, returning 7554 1726853178.78596: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [02083763-bbaf-bdc3-98b6-0000000010b1] 7554 1726853178.78598: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010b1 7554 1726853178.78658: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010b1 7554 1726853178.78661: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7554 
1726853178.78709: no more pending results, returning what we have 7554 1726853178.78712: results queue empty 7554 1726853178.78713: checking for any_errors_fatal 7554 1726853178.78719: done checking for any_errors_fatal 7554 1726853178.78720: checking for max_fail_percentage 7554 1726853178.78721: done checking for max_fail_percentage 7554 1726853178.78722: checking to see if all hosts have failed and the running result is not ok 7554 1726853178.78723: done checking to see if all hosts have failed 7554 1726853178.78724: getting the remaining hosts for this loop 7554 1726853178.78725: done getting the remaining hosts for this loop 7554 1726853178.78729: getting the next task for host managed_node3 7554 1726853178.78735: done getting next task for host managed_node3 7554 1726853178.78737: ^ task is: TASK: Delete dummy interface {{ interface }} 7554 1726853178.78741: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853178.78748: getting variables 7554 1726853178.78749: in VariableManager get_vars() 7554 1726853178.78797: Calling all_inventory to load vars for managed_node3 7554 1726853178.78800: Calling groups_inventory to load vars for managed_node3 7554 1726853178.78802: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853178.78814: Calling all_plugins_play to load vars for managed_node3 7554 1726853178.78817: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853178.78820: Calling groups_plugins_play to load vars for managed_node3 7554 1726853178.83165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853178.84000: done with get_vars() 7554 1726853178.84016: done getting variables 7554 1726853178.84051: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853178.84118: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:26:18 -0400 (0:00:00.067) 0:00:32.808 ****** 7554 1726853178.84135: entering _queue_task() for managed_node3/command 7554 1726853178.84383: worker is 1 (out of 1 available) 7554 1726853178.84397: exiting _queue_task() for managed_node3/command 7554 1726853178.84409: done queuing things up, now waiting for results queue to drain 7554 1726853178.84411: waiting for pending results... 
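The skipped tasks in this stretch of the run ("Delete veth interface veth0", "Create dummy interface veth0", and the dummy/tap variants that follow) all share one pattern: a `command` task guarded by a `when:` clause on `type`, `state`, and `current_interfaces`. A sketch of two of them, with the conditions copied verbatim from the logged `false_condition` strings; the `ip link` commands themselves are assumptions, since the skipped tasks never reach execution and their commands are not in the log:

```yaml
# Hypothetical sketch of the guarded manage_test_interface.yml tasks.
# The when: expressions are taken from the logged false_condition values;
# the command bodies are illustrative only.
- name: "Delete veth interface {{ interface }}"
  ansible.builtin.command: ip link del {{ interface }}  # assumed command
  when: type == 'veth' and state == 'absent' and interface in current_interfaces

- name: "Create dummy interface {{ interface }}"
  ansible.builtin.command: ip link add {{ interface }} type dummy  # assumed command
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces
```

With `type == 'veth'` and the interface present, every one of these conditions evaluates False in this run, which is why the trace shows a run of `skipping: [managed_node3]` results with `skip_reason: "Conditional result was False"` rather than any command execution.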
7554 1726853178.84596: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7554 1726853178.84670: in run() - task 02083763-bbaf-bdc3-98b6-0000000010b2 7554 1726853178.84684: variable 'ansible_search_path' from source: unknown 7554 1726853178.84687: variable 'ansible_search_path' from source: unknown 7554 1726853178.84715: calling self._execute() 7554 1726853178.84791: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853178.84795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853178.84803: variable 'omit' from source: magic vars 7554 1726853178.85081: variable 'ansible_distribution_major_version' from source: facts 7554 1726853178.85087: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853178.85222: variable 'type' from source: play vars 7554 1726853178.85226: variable 'state' from source: include params 7554 1726853178.85229: variable 'interface' from source: play vars 7554 1726853178.85234: variable 'current_interfaces' from source: set_fact 7554 1726853178.85242: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7554 1726853178.85249: when evaluation is False, skipping this task 7554 1726853178.85252: _execute() done 7554 1726853178.85254: dumping result to json 7554 1726853178.85257: done dumping result, returning 7554 1726853178.85263: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [02083763-bbaf-bdc3-98b6-0000000010b2] 7554 1726853178.85268: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010b2 7554 1726853178.85353: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010b2 7554 1726853178.85356: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7554 
1726853178.85431: no more pending results, returning what we have 7554 1726853178.85434: results queue empty 7554 1726853178.85435: checking for any_errors_fatal 7554 1726853178.85440: done checking for any_errors_fatal 7554 1726853178.85441: checking for max_fail_percentage 7554 1726853178.85442: done checking for max_fail_percentage 7554 1726853178.85443: checking to see if all hosts have failed and the running result is not ok 7554 1726853178.85444: done checking to see if all hosts have failed 7554 1726853178.85444: getting the remaining hosts for this loop 7554 1726853178.85446: done getting the remaining hosts for this loop 7554 1726853178.85449: getting the next task for host managed_node3 7554 1726853178.85454: done getting next task for host managed_node3 7554 1726853178.85456: ^ task is: TASK: Create tap interface {{ interface }} 7554 1726853178.85459: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853178.85463: getting variables 7554 1726853178.85466: in VariableManager get_vars() 7554 1726853178.85508: Calling all_inventory to load vars for managed_node3 7554 1726853178.85510: Calling groups_inventory to load vars for managed_node3 7554 1726853178.85512: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853178.85521: Calling all_plugins_play to load vars for managed_node3 7554 1726853178.85524: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853178.85526: Calling groups_plugins_play to load vars for managed_node3 7554 1726853178.86242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853178.87093: done with get_vars() 7554 1726853178.87108: done getting variables 7554 1726853178.87146: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853178.87219: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:26:18 -0400 (0:00:00.031) 0:00:32.839 ****** 7554 1726853178.87239: entering _queue_task() for managed_node3/command 7554 1726853178.87438: worker is 1 (out of 1 available) 7554 1726853178.87451: exiting _queue_task() for managed_node3/command 7554 1726853178.87463: done queuing things up, now waiting for results queue to drain 7554 1726853178.87465: waiting for pending results... 
7554 1726853178.87629: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7554 1726853178.87698: in run() - task 02083763-bbaf-bdc3-98b6-0000000010b3 7554 1726853178.87705: variable 'ansible_search_path' from source: unknown 7554 1726853178.87708: variable 'ansible_search_path' from source: unknown 7554 1726853178.87735: calling self._execute() 7554 1726853178.87809: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853178.87813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853178.87823: variable 'omit' from source: magic vars 7554 1726853178.88075: variable 'ansible_distribution_major_version' from source: facts 7554 1726853178.88084: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853178.88216: variable 'type' from source: play vars 7554 1726853178.88220: variable 'state' from source: include params 7554 1726853178.88232: variable 'interface' from source: play vars 7554 1726853178.88235: variable 'current_interfaces' from source: set_fact 7554 1726853178.88238: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7554 1726853178.88242: when evaluation is False, skipping this task 7554 1726853178.88244: _execute() done 7554 1726853178.88247: dumping result to json 7554 1726853178.88249: done dumping result, returning 7554 1726853178.88257: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [02083763-bbaf-bdc3-98b6-0000000010b3] 7554 1726853178.88259: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010b3 7554 1726853178.88338: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010b3 7554 1726853178.88340: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7554 
1726853178.88400: no more pending results, returning what we have 7554 1726853178.88403: results queue empty 7554 1726853178.88403: checking for any_errors_fatal 7554 1726853178.88408: done checking for any_errors_fatal 7554 1726853178.88408: checking for max_fail_percentage 7554 1726853178.88410: done checking for max_fail_percentage 7554 1726853178.88410: checking to see if all hosts have failed and the running result is not ok 7554 1726853178.88411: done checking to see if all hosts have failed 7554 1726853178.88412: getting the remaining hosts for this loop 7554 1726853178.88413: done getting the remaining hosts for this loop 7554 1726853178.88416: getting the next task for host managed_node3 7554 1726853178.88420: done getting next task for host managed_node3 7554 1726853178.88422: ^ task is: TASK: Delete tap interface {{ interface }} 7554 1726853178.88424: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853178.88428: getting variables 7554 1726853178.88429: in VariableManager get_vars() 7554 1726853178.88467: Calling all_inventory to load vars for managed_node3 7554 1726853178.88470: Calling groups_inventory to load vars for managed_node3 7554 1726853178.88473: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853178.88482: Calling all_plugins_play to load vars for managed_node3 7554 1726853178.88484: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853178.88486: Calling groups_plugins_play to load vars for managed_node3 7554 1726853178.89304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853178.90144: done with get_vars() 7554 1726853178.90158: done getting variables 7554 1726853178.90196: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853178.90267: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:26:18 -0400 (0:00:00.030) 0:00:32.870 ****** 7554 1726853178.90289: entering _queue_task() for managed_node3/command 7554 1726853178.90473: worker is 1 (out of 1 available) 7554 1726853178.90488: exiting _queue_task() for managed_node3/command 7554 1726853178.90500: done queuing things up, now waiting for results queue to drain 7554 1726853178.90501: waiting for pending results... 
7554 1726853178.90661: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7554 1726853178.90726: in run() - task 02083763-bbaf-bdc3-98b6-0000000010b4 7554 1726853178.90739: variable 'ansible_search_path' from source: unknown 7554 1726853178.90742: variable 'ansible_search_path' from source: unknown 7554 1726853178.90768: calling self._execute() 7554 1726853178.90840: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853178.90844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853178.90857: variable 'omit' from source: magic vars 7554 1726853178.91116: variable 'ansible_distribution_major_version' from source: facts 7554 1726853178.91125: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853178.91253: variable 'type' from source: play vars 7554 1726853178.91257: variable 'state' from source: include params 7554 1726853178.91260: variable 'interface' from source: play vars 7554 1726853178.91263: variable 'current_interfaces' from source: set_fact 7554 1726853178.91272: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7554 1726853178.91275: when evaluation is False, skipping this task 7554 1726853178.91278: _execute() done 7554 1726853178.91282: dumping result to json 7554 1726853178.91284: done dumping result, returning 7554 1726853178.91287: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [02083763-bbaf-bdc3-98b6-0000000010b4] 7554 1726853178.91297: sending task result for task 02083763-bbaf-bdc3-98b6-0000000010b4 7554 1726853178.91367: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000010b4 7554 1726853178.91369: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7554 1726853178.91441: no 
more pending results, returning what we have 7554 1726853178.91443: results queue empty 7554 1726853178.91444: checking for any_errors_fatal 7554 1726853178.91448: done checking for any_errors_fatal 7554 1726853178.91448: checking for max_fail_percentage 7554 1726853178.91450: done checking for max_fail_percentage 7554 1726853178.91450: checking to see if all hosts have failed and the running result is not ok 7554 1726853178.91451: done checking to see if all hosts have failed 7554 1726853178.91452: getting the remaining hosts for this loop 7554 1726853178.91453: done getting the remaining hosts for this loop 7554 1726853178.91456: getting the next task for host managed_node3 7554 1726853178.91463: done getting next task for host managed_node3 7554 1726853178.91467: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7554 1726853178.91470: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853178.91489: getting variables 7554 1726853178.91490: in VariableManager get_vars() 7554 1726853178.91527: Calling all_inventory to load vars for managed_node3 7554 1726853178.91529: Calling groups_inventory to load vars for managed_node3 7554 1726853178.91531: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853178.91537: Calling all_plugins_play to load vars for managed_node3 7554 1726853178.91538: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853178.91540: Calling groups_plugins_play to load vars for managed_node3 7554 1726853178.92239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853178.93102: done with get_vars() 7554 1726853178.93115: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:26:18 -0400 (0:00:00.028) 0:00:32.899 ****** 7554 1726853178.93181: entering _queue_task() for managed_node3/include_tasks 7554 1726853178.93368: worker is 1 (out of 1 available) 7554 1726853178.93383: exiting _queue_task() for managed_node3/include_tasks 7554 1726853178.93395: done queuing things up, now waiting for results queue to drain 7554 1726853178.93396: waiting for pending results... 
7554 1726853178.93565: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7554 1726853178.93654: in run() - task 02083763-bbaf-bdc3-98b6-0000000000b8 7554 1726853178.93667: variable 'ansible_search_path' from source: unknown 7554 1726853178.93670: variable 'ansible_search_path' from source: unknown 7554 1726853178.93698: calling self._execute() 7554 1726853178.93766: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853178.93770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853178.93780: variable 'omit' from source: magic vars 7554 1726853178.94031: variable 'ansible_distribution_major_version' from source: facts 7554 1726853178.94041: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853178.94049: _execute() done 7554 1726853178.94053: dumping result to json 7554 1726853178.94056: done dumping result, returning 7554 1726853178.94067: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-bdc3-98b6-0000000000b8] 7554 1726853178.94070: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000b8 7554 1726853178.94148: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000b8 7554 1726853178.94150: WORKER PROCESS EXITING 7554 1726853178.94211: no more pending results, returning what we have 7554 1726853178.94215: in VariableManager get_vars() 7554 1726853178.94257: Calling all_inventory to load vars for managed_node3 7554 1726853178.94260: Calling groups_inventory to load vars for managed_node3 7554 1726853178.94262: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853178.94269: Calling all_plugins_play to load vars for managed_node3 7554 1726853178.94273: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853178.94276: Calling groups_plugins_play to load vars for 
managed_node3 7554 1726853178.95066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853178.95906: done with get_vars() 7554 1726853178.95918: variable 'ansible_search_path' from source: unknown 7554 1726853178.95919: variable 'ansible_search_path' from source: unknown 7554 1726853178.95946: we have included files to process 7554 1726853178.95947: generating all_blocks data 7554 1726853178.95948: done generating all_blocks data 7554 1726853178.95952: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853178.95952: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853178.95954: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853178.96318: done processing included file 7554 1726853178.96319: iterating over new_blocks loaded from include file 7554 1726853178.96320: in VariableManager get_vars() 7554 1726853178.96338: done with get_vars() 7554 1726853178.96339: filtering new block on tags 7554 1726853178.96350: done filtering new block on tags 7554 1726853178.96352: in VariableManager get_vars() 7554 1726853178.96370: done with get_vars() 7554 1726853178.96373: filtering new block on tags 7554 1726853178.96384: done filtering new block on tags 7554 1726853178.96386: in VariableManager get_vars() 7554 1726853178.96401: done with get_vars() 7554 1726853178.96403: filtering new block on tags 7554 1726853178.96412: done filtering new block on tags 7554 1726853178.96413: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7554 1726853178.96417: extending task lists for all hosts with included blocks 7554 1726853178.96851: done 
extending task lists 7554 1726853178.96852: done processing included files 7554 1726853178.96852: results queue empty 7554 1726853178.96853: checking for any_errors_fatal 7554 1726853178.96855: done checking for any_errors_fatal 7554 1726853178.96855: checking for max_fail_percentage 7554 1726853178.96856: done checking for max_fail_percentage 7554 1726853178.96856: checking to see if all hosts have failed and the running result is not ok 7554 1726853178.96857: done checking to see if all hosts have failed 7554 1726853178.96857: getting the remaining hosts for this loop 7554 1726853178.96858: done getting the remaining hosts for this loop 7554 1726853178.96860: getting the next task for host managed_node3 7554 1726853178.96862: done getting next task for host managed_node3 7554 1726853178.96864: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7554 1726853178.96866: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853178.96874: getting variables 7554 1726853178.96875: in VariableManager get_vars() 7554 1726853178.96887: Calling all_inventory to load vars for managed_node3 7554 1726853178.96889: Calling groups_inventory to load vars for managed_node3 7554 1726853178.96890: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853178.96894: Calling all_plugins_play to load vars for managed_node3 7554 1726853178.96895: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853178.96898: Calling groups_plugins_play to load vars for managed_node3 7554 1726853178.97563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853178.98397: done with get_vars() 7554 1726853178.98410: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:26:18 -0400 (0:00:00.052) 0:00:32.952 ****** 7554 1726853178.98459: entering _queue_task() for managed_node3/setup 7554 1726853178.98707: worker is 1 (out of 1 available) 7554 1726853178.98720: exiting _queue_task() for managed_node3/setup 7554 1726853178.98733: done queuing things up, now waiting for results queue to drain 7554 1726853178.98734: waiting for pending results... 
7554 1726853178.98924: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7554 1726853178.99025: in run() - task 02083763-bbaf-bdc3-98b6-000000001381 7554 1726853178.99036: variable 'ansible_search_path' from source: unknown 7554 1726853178.99040: variable 'ansible_search_path' from source: unknown 7554 1726853178.99069: calling self._execute() 7554 1726853178.99151: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853178.99155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853178.99165: variable 'omit' from source: magic vars 7554 1726853178.99448: variable 'ansible_distribution_major_version' from source: facts 7554 1726853178.99455: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853178.99610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853179.01047: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853179.01099: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853179.01126: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853179.01154: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853179.01176: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853179.01232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853179.01254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853179.01281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853179.01306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853179.01316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853179.01356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853179.01379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853179.01396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853179.01421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853179.01432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853179.01544: variable '__network_required_facts' from source: role '' defaults 
7554 1726853179.01554: variable 'ansible_facts' from source: unknown 7554 1726853179.02014: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7554 1726853179.02018: when evaluation is False, skipping this task 7554 1726853179.02020: _execute() done 7554 1726853179.02023: dumping result to json 7554 1726853179.02025: done dumping result, returning 7554 1726853179.02033: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-bdc3-98b6-000000001381] 7554 1726853179.02035: sending task result for task 02083763-bbaf-bdc3-98b6-000000001381 7554 1726853179.02126: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001381 7554 1726853179.02128: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853179.02177: no more pending results, returning what we have 7554 1726853179.02181: results queue empty 7554 1726853179.02182: checking for any_errors_fatal 7554 1726853179.02183: done checking for any_errors_fatal 7554 1726853179.02184: checking for max_fail_percentage 7554 1726853179.02185: done checking for max_fail_percentage 7554 1726853179.02186: checking to see if all hosts have failed and the running result is not ok 7554 1726853179.02187: done checking to see if all hosts have failed 7554 1726853179.02188: getting the remaining hosts for this loop 7554 1726853179.02189: done getting the remaining hosts for this loop 7554 1726853179.02193: getting the next task for host managed_node3 7554 1726853179.02201: done getting next task for host managed_node3 7554 1726853179.02205: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7554 1726853179.02208: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853179.02230: getting variables 7554 1726853179.02232: in VariableManager get_vars() 7554 1726853179.02281: Calling all_inventory to load vars for managed_node3 7554 1726853179.02284: Calling groups_inventory to load vars for managed_node3 7554 1726853179.02287: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853179.02296: Calling all_plugins_play to load vars for managed_node3 7554 1726853179.02298: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853179.02300: Calling groups_plugins_play to load vars for managed_node3 7554 1726853179.03118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853179.03980: done with get_vars() 7554 1726853179.03996: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:26:19 -0400 (0:00:00.056) 0:00:33.008 ****** 7554 1726853179.04073: entering _queue_task() for managed_node3/stat 7554 1726853179.04304: worker is 1 (out of 1 available) 7554 1726853179.04319: exiting _queue_task() 
for managed_node3/stat 7554 1726853179.04331: done queuing things up, now waiting for results queue to drain 7554 1726853179.04333: waiting for pending results... 7554 1726853179.04518: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7554 1726853179.04625: in run() - task 02083763-bbaf-bdc3-98b6-000000001383 7554 1726853179.04637: variable 'ansible_search_path' from source: unknown 7554 1726853179.04641: variable 'ansible_search_path' from source: unknown 7554 1726853179.04673: calling self._execute() 7554 1726853179.04747: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853179.04751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853179.04759: variable 'omit' from source: magic vars 7554 1726853179.05036: variable 'ansible_distribution_major_version' from source: facts 7554 1726853179.05047: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853179.05166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853179.05360: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853179.05394: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853179.05419: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853179.05448: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853179.05547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853179.05560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853179.05581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853179.05599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853179.05665: variable '__network_is_ostree' from source: set_fact 7554 1726853179.05672: Evaluated conditional (not __network_is_ostree is defined): False 7554 1726853179.05675: when evaluation is False, skipping this task 7554 1726853179.05678: _execute() done 7554 1726853179.05680: dumping result to json 7554 1726853179.05683: done dumping result, returning 7554 1726853179.05691: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-bdc3-98b6-000000001383] 7554 1726853179.05695: sending task result for task 02083763-bbaf-bdc3-98b6-000000001383 7554 1726853179.05784: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001383 7554 1726853179.05786: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7554 1726853179.05834: no more pending results, returning what we have 7554 1726853179.05837: results queue empty 7554 1726853179.05838: checking for any_errors_fatal 7554 1726853179.05846: done checking for any_errors_fatal 7554 1726853179.05847: checking for max_fail_percentage 7554 1726853179.05848: done checking for max_fail_percentage 7554 1726853179.05849: checking to see if all hosts have failed and the running result is not ok 7554 1726853179.05851: done checking to see if all hosts have failed 7554 
1726853179.05851: getting the remaining hosts for this loop 7554 1726853179.05853: done getting the remaining hosts for this loop 7554 1726853179.05856: getting the next task for host managed_node3 7554 1726853179.05864: done getting next task for host managed_node3 7554 1726853179.05867: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7554 1726853179.05872: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853179.05893: getting variables 7554 1726853179.05897: in VariableManager get_vars() 7554 1726853179.05939: Calling all_inventory to load vars for managed_node3 7554 1726853179.05943: Calling groups_inventory to load vars for managed_node3 7554 1726853179.05946: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853179.05954: Calling all_plugins_play to load vars for managed_node3 7554 1726853179.05957: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853179.05959: Calling groups_plugins_play to load vars for managed_node3 7554 1726853179.06868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853179.07725: done with get_vars() 7554 1726853179.07743: done getting variables 7554 1726853179.07787: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:26:19 -0400 (0:00:00.037) 0:00:33.045 ****** 7554 1726853179.07813: entering _queue_task() for managed_node3/set_fact 7554 1726853179.08054: worker is 1 (out of 1 available) 7554 1726853179.08067: exiting _queue_task() for managed_node3/set_fact 7554 1726853179.08082: done queuing things up, now waiting for results queue to drain 7554 1726853179.08085: waiting for pending results... 
7554 1726853179.08264: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7554 1726853179.08366: in run() - task 02083763-bbaf-bdc3-98b6-000000001384 7554 1726853179.08380: variable 'ansible_search_path' from source: unknown 7554 1726853179.08384: variable 'ansible_search_path' from source: unknown 7554 1726853179.08409: calling self._execute() 7554 1726853179.08484: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853179.08488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853179.08498: variable 'omit' from source: magic vars 7554 1726853179.08766: variable 'ansible_distribution_major_version' from source: facts 7554 1726853179.08777: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853179.08892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853179.09082: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853179.09112: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853179.09136: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853179.09161: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853179.09251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853179.09269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853179.09291: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853179.09311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853179.09375: variable '__network_is_ostree' from source: set_fact 7554 1726853179.09382: Evaluated conditional (not __network_is_ostree is defined): False 7554 1726853179.09385: when evaluation is False, skipping this task 7554 1726853179.09388: _execute() done 7554 1726853179.09390: dumping result to json 7554 1726853179.09392: done dumping result, returning 7554 1726853179.09401: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-bdc3-98b6-000000001384] 7554 1726853179.09404: sending task result for task 02083763-bbaf-bdc3-98b6-000000001384 7554 1726853179.09490: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001384 7554 1726853179.09493: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7554 1726853179.09563: no more pending results, returning what we have 7554 1726853179.09567: results queue empty 7554 1726853179.09567: checking for any_errors_fatal 7554 1726853179.09577: done checking for any_errors_fatal 7554 1726853179.09578: checking for max_fail_percentage 7554 1726853179.09579: done checking for max_fail_percentage 7554 1726853179.09580: checking to see if all hosts have failed and the running result is not ok 7554 1726853179.09581: done checking to see if all hosts have failed 7554 1726853179.09582: getting the remaining hosts for this loop 7554 1726853179.09583: done getting the remaining hosts for this loop 7554 1726853179.09587: 
getting the next task for host managed_node3 7554 1726853179.09595: done getting next task for host managed_node3 7554 1726853179.09598: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7554 1726853179.09602: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853179.09617: getting variables 7554 1726853179.09618: in VariableManager get_vars() 7554 1726853179.09662: Calling all_inventory to load vars for managed_node3 7554 1726853179.09665: Calling groups_inventory to load vars for managed_node3 7554 1726853179.09667: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853179.09681: Calling all_plugins_play to load vars for managed_node3 7554 1726853179.09684: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853179.09687: Calling groups_plugins_play to load vars for managed_node3 7554 1726853179.10444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853179.11318: done with get_vars() 7554 1726853179.11334: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:26:19 -0400 (0:00:00.035) 0:00:33.081 ****** 7554 1726853179.11403: entering _queue_task() for managed_node3/service_facts 7554 1726853179.11636: worker is 1 (out of 1 available) 7554 1726853179.11653: exiting _queue_task() for managed_node3/service_facts 7554 1726853179.11667: done queuing things up, now waiting for results queue to drain 7554 1726853179.11669: waiting for pending results... 
7554 1726853179.11856: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7554 1726853179.11961: in run() - task 02083763-bbaf-bdc3-98b6-000000001386 7554 1726853179.11975: variable 'ansible_search_path' from source: unknown 7554 1726853179.11979: variable 'ansible_search_path' from source: unknown 7554 1726853179.12010: calling self._execute() 7554 1726853179.12080: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853179.12085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853179.12094: variable 'omit' from source: magic vars 7554 1726853179.12364: variable 'ansible_distribution_major_version' from source: facts 7554 1726853179.12375: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853179.12381: variable 'omit' from source: magic vars 7554 1726853179.12430: variable 'omit' from source: magic vars 7554 1726853179.12458: variable 'omit' from source: magic vars 7554 1726853179.12490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853179.12518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853179.12534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853179.12550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853179.12559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853179.12584: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853179.12587: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853179.12590: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853179.12661: Set connection var ansible_shell_executable to /bin/sh 7554 1726853179.12665: Set connection var ansible_pipelining to False 7554 1726853179.12667: Set connection var ansible_shell_type to sh 7554 1726853179.12670: Set connection var ansible_connection to ssh 7554 1726853179.12681: Set connection var ansible_timeout to 10 7554 1726853179.12685: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853179.12701: variable 'ansible_shell_executable' from source: unknown 7554 1726853179.12704: variable 'ansible_connection' from source: unknown 7554 1726853179.12707: variable 'ansible_module_compression' from source: unknown 7554 1726853179.12709: variable 'ansible_shell_type' from source: unknown 7554 1726853179.12712: variable 'ansible_shell_executable' from source: unknown 7554 1726853179.12714: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853179.12716: variable 'ansible_pipelining' from source: unknown 7554 1726853179.12719: variable 'ansible_timeout' from source: unknown 7554 1726853179.12723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853179.12862: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853179.12873: variable 'omit' from source: magic vars 7554 1726853179.12876: starting attempt loop 7554 1726853179.12878: running the handler 7554 1726853179.12896: _low_level_execute_command(): starting 7554 1726853179.12899: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853179.13413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853179.13417: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853179.13421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853179.13425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853179.13480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853179.13483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853179.13486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853179.13560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853179.15275: stdout chunk (state=3): >>>/root <<< 7554 1726853179.15380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853179.15406: stderr chunk (state=3): >>><<< 7554 1726853179.15409: stdout chunk (state=3): >>><<< 7554 1726853179.15429: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853179.15440: _low_level_execute_command(): starting 7554 1726853179.15445: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668 `" && echo ansible-tmp-1726853179.154288-8828-237548075113668="` echo /root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668 `" ) && sleep 0' 7554 1726853179.15883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853179.15886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853179.15889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853179.15898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853179.15901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853179.15950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853179.15953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853179.15959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853179.16026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853179.17953: stdout chunk (state=3): >>>ansible-tmp-1726853179.154288-8828-237548075113668=/root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668 <<< 7554 1726853179.18056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853179.18084: stderr chunk (state=3): >>><<< 7554 1726853179.18088: stdout chunk (state=3): >>><<< 7554 1726853179.18100: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853179.154288-8828-237548075113668=/root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853179.18137: variable 'ansible_module_compression' from source: unknown 7554 1726853179.18177: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7554 1726853179.18210: variable 'ansible_facts' from source: unknown 7554 1726853179.18269: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668/AnsiballZ_service_facts.py 7554 1726853179.18369: Sending initial data 7554 1726853179.18375: Sent initial data (159 bytes) 7554 1726853179.18828: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853179.18831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853179.18834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853179.18836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853179.18838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853179.18888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853179.18892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853179.18964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853179.20551: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7554 1726853179.20555: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853179.20608: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853179.20668: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpqo7drhim /root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668/AnsiballZ_service_facts.py <<< 7554 1726853179.20673: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668/AnsiballZ_service_facts.py" <<< 7554 1726853179.20724: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpqo7drhim" to remote "/root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668/AnsiballZ_service_facts.py" <<< 7554 1726853179.21354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853179.21393: stderr chunk (state=3): >>><<< 7554 1726853179.21396: stdout chunk (state=3): >>><<< 7554 1726853179.21457: done transferring module to remote 7554 1726853179.21466: _low_level_execute_command(): starting 7554 1726853179.21472: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668/ /root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668/AnsiballZ_service_facts.py && sleep 0' 7554 1726853179.21908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853179.21911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853179.21914: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853179.21919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853179.21922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853179.21961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853179.21965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853179.22032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853179.23874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853179.23897: stderr chunk (state=3): >>><<< 7554 1726853179.23900: stdout chunk (state=3): >>><<< 7554 1726853179.23912: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853179.23915: _low_level_execute_command(): starting 7554 1726853179.23919: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668/AnsiballZ_service_facts.py && sleep 0' 7554 1726853179.24350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853179.24353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853179.24355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853179.24357: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853179.24359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853179.24414: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853179.24421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853179.24423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853179.24482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853180.81136: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": 
{"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": 
"initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 7554 1726853180.81159: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", <<< 7554 1726853180.81167: stdout chunk (state=3): >>>"state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", <<< 7554 1726853180.81179: stdout chunk (state=3): >>>"status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": 
"disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", <<< 7554 1726853180.81200: stdout chunk (state=3): >>>"state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": <<< 7554 1726853180.81218: stdout chunk (state=3): >>>{"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7554 1726853180.82976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853180.82981: stderr chunk (state=3): >>><<< 7554 1726853180.82984: stdout chunk (state=3): >>><<< 7554 1726853180.82995: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": 
"autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": 
"sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
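The `service_facts` result that ends above is a flat mapping from unit name to a dict with `state`, `status`, and `source` keys. As a minimal sketch of how a consumer might filter that shape (the helper name and sample data are illustrative, not part of the module):

```python
# Sketch: filter an ansible.builtin.service_facts-style payload.
# The payload shape (name -> {state, status, source}) matches the log above;
# running_services() and the sample dict are hypothetical, for illustration.

def running_services(services: dict) -> list[str]:
    """Return the names of units whose state is 'running', sorted."""
    return sorted(
        name for name, info in services.items()
        if info.get("state") == "running"
    )

sample = {
    "systemd-journald.service": {"name": "systemd-journald.service",
                                 "state": "running", "status": "static",
                                 "source": "systemd"},
    "systemd-modules-load.service": {"name": "systemd-modules-load.service",
                                     "state": "stopped", "status": "static",
                                     "source": "systemd"},
}

print(running_services(sample))  # ['systemd-journald.service']
```

In a playbook the same filter would typically be expressed with Jinja2 (`dict2items` plus `selectattr`) against `ansible_facts.services` rather than in Python.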
7554 1726853180.84007: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853180.84021: _low_level_execute_command(): starting 7554 1726853180.84030: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853179.154288-8828-237548075113668/ > /dev/null 2>&1 && sleep 0' 7554 1726853180.84696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853180.84709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853180.84730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853180.84750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853180.84794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 
1726853180.84805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853180.84815: stderr chunk (state=3): >>>debug2: match found <<< 7554 1726853180.84838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853180.84909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853180.84924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853180.84959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853180.85062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853180.87011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853180.87015: stdout chunk (state=3): >>><<< 7554 1726853180.87023: stderr chunk (state=3): >>><<< 7554 1726853180.87038: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853180.87046: handler run complete 7554 1726853180.87237: variable 'ansible_facts' from source: unknown 7554 1726853180.87408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853180.87901: variable 'ansible_facts' from source: unknown 7554 1726853180.88058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853180.88277: attempt loop complete, returning result 7554 1726853180.88282: _execute() done 7554 1726853180.88285: dumping result to json 7554 1726853180.88347: done dumping result, returning 7554 1726853180.88355: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-bdc3-98b6-000000001386] 7554 1726853180.88361: sending task result for task 02083763-bbaf-bdc3-98b6-000000001386 7554 1726853180.89448: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001386 7554 1726853180.89451: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853180.89621: no more pending results, returning what we have 7554 1726853180.89624: results queue empty 7554 1726853180.89625: checking for any_errors_fatal 7554 1726853180.89629: done checking for any_errors_fatal 7554 1726853180.89629: checking for max_fail_percentage 7554 1726853180.89631: done checking for max_fail_percentage 7554 1726853180.89632: checking to see if all hosts have failed and the running result is not ok 7554 1726853180.89632: done checking to see if all hosts have failed 7554 1726853180.89633: getting the remaining 
hosts for this loop 7554 1726853180.89634: done getting the remaining hosts for this loop 7554 1726853180.89638: getting the next task for host managed_node3 7554 1726853180.89646: done getting next task for host managed_node3 7554 1726853180.89650: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7554 1726853180.89657: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853180.89668: getting variables 7554 1726853180.89670: in VariableManager get_vars() 7554 1726853180.89713: Calling all_inventory to load vars for managed_node3 7554 1726853180.89716: Calling groups_inventory to load vars for managed_node3 7554 1726853180.89719: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853180.89728: Calling all_plugins_play to load vars for managed_node3 7554 1726853180.89730: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853180.89733: Calling groups_plugins_play to load vars for managed_node3 7554 1726853180.91004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853180.92696: done with get_vars() 7554 1726853180.92717: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:26:20 -0400 (0:00:01.814) 0:00:34.895 ****** 7554 1726853180.92829: entering _queue_task() for managed_node3/package_facts 7554 1726853180.93389: worker is 1 (out of 1 available) 7554 1726853180.93398: exiting _queue_task() for managed_node3/package_facts 7554 1726853180.93410: done queuing things up, now waiting for results queue to drain 7554 1726853180.93411: waiting for pending results... 
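The task banner above carries two timing stamps: `(0:00:01.814)` is the duration of the task that just finished and `0:00:34.895` is the cumulative playbook runtime. A minimal sketch of extracting both from such a line (the regex and helper are assumptions for illustration, not part of ansible-core):

```python
# Sketch: parse the "(task elapsed) cumulative" stamps from a profiling
# banner like the one in the log above. Regex and function are hypothetical.
import re

STAMP = re.compile(
    r"\((?P<task>\d+:\d{2}:\d{2}\.\d+)\)\s+(?P<total>\d+:\d{2}:\d{2}\.\d+)"
)

def to_seconds(stamp: str) -> float:
    """Convert an H:MM:SS.mmm stamp to seconds."""
    h, m, s = stamp.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

line = "Friday 20 September 2024 13:26:20 -0400 (0:00:01.814) 0:00:34.895 ******"
match = STAMP.search(line)
task_s = to_seconds(match["task"])
total_s = to_seconds(match["total"])
print(task_s, total_s)  # 1.814 34.895
```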
7554 1726853180.93545: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7554 1726853180.93751: in run() - task 02083763-bbaf-bdc3-98b6-000000001387 7554 1726853180.93755: variable 'ansible_search_path' from source: unknown 7554 1726853180.93758: variable 'ansible_search_path' from source: unknown 7554 1726853180.93793: calling self._execute() 7554 1726853180.93968: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853180.93974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853180.93977: variable 'omit' from source: magic vars 7554 1726853180.94320: variable 'ansible_distribution_major_version' from source: facts 7554 1726853180.94337: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853180.94351: variable 'omit' from source: magic vars 7554 1726853180.94437: variable 'omit' from source: magic vars 7554 1726853180.94480: variable 'omit' from source: magic vars 7554 1726853180.94531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853180.94575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853180.94602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853180.94632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853180.94676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853180.94687: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853180.94695: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853180.94703: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853180.94839: Set connection var ansible_shell_executable to /bin/sh 7554 1726853180.94856: Set connection var ansible_pipelining to False 7554 1726853180.94947: Set connection var ansible_shell_type to sh 7554 1726853180.94951: Set connection var ansible_connection to ssh 7554 1726853180.94953: Set connection var ansible_timeout to 10 7554 1726853180.94956: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853180.94958: variable 'ansible_shell_executable' from source: unknown 7554 1726853180.94960: variable 'ansible_connection' from source: unknown 7554 1726853180.94963: variable 'ansible_module_compression' from source: unknown 7554 1726853180.94965: variable 'ansible_shell_type' from source: unknown 7554 1726853180.94967: variable 'ansible_shell_executable' from source: unknown 7554 1726853180.94969: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853180.94973: variable 'ansible_pipelining' from source: unknown 7554 1726853180.94975: variable 'ansible_timeout' from source: unknown 7554 1726853180.94977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853180.95168: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853180.95186: variable 'omit' from source: magic vars 7554 1726853180.95195: starting attempt loop 7554 1726853180.95202: running the handler 7554 1726853180.95219: _low_level_execute_command(): starting 7554 1726853180.95231: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853180.95993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853180.96056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853180.96090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853180.96183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853180.97892: stdout chunk (state=3): >>>/root <<< 7554 1726853180.98077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853180.98081: stdout chunk (state=3): >>><<< 7554 1726853180.98083: stderr chunk (state=3): >>><<< 7554 1726853180.98087: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853180.98098: _low_level_execute_command(): starting 7554 1726853180.98109: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885 `" && echo ansible-tmp-1726853180.980661-8868-95303299154885="` echo /root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885 `" ) && sleep 0' 7554 1726853180.98751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853180.98766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853180.98789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853180.98811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853180.98835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853180.98941: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853180.98975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853180.99077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853181.01051: stdout chunk (state=3): >>>ansible-tmp-1726853180.980661-8868-95303299154885=/root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885 <<< 7554 1726853181.01185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853181.01210: stderr chunk (state=3): >>><<< 7554 1726853181.01221: stdout chunk (state=3): >>><<< 7554 1726853181.01476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853180.980661-8868-95303299154885=/root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853181.01480: variable 'ansible_module_compression' from source: unknown 7554 1726853181.01483: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7554 1726853181.01485: variable 'ansible_facts' from source: unknown 7554 1726853181.01632: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885/AnsiballZ_package_facts.py 7554 1726853181.01791: Sending initial data 7554 1726853181.01800: Sent initial data (158 bytes) 7554 1726853181.02438: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853181.02486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853181.02500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853181.02510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 
1726853181.02589: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853181.02609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853181.02700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853181.04342: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7554 1726853181.04360: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853181.04444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853181.04507: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp416wdaiu /root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885/AnsiballZ_package_facts.py <<< 7554 1726853181.04510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885/AnsiballZ_package_facts.py" <<< 7554 1726853181.04567: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp416wdaiu" to remote "/root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885/AnsiballZ_package_facts.py" <<< 7554 1726853181.06214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853181.06351: stderr chunk (state=3): >>><<< 7554 1726853181.06355: stdout chunk (state=3): >>><<< 7554 1726853181.06357: done transferring module to remote 7554 1726853181.06359: _low_level_execute_command(): starting 7554 1726853181.06361: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885/ /root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885/AnsiballZ_package_facts.py && sleep 0' 7554 1726853181.06886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853181.06983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853181.06999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853181.07011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853181.07099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853181.09136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853181.09140: stdout chunk (state=3): >>><<< 7554 1726853181.09177: stderr chunk (state=3): >>><<< 7554 1726853181.09180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853181.09189: _low_level_execute_command(): starting 7554 1726853181.09191: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885/AnsiballZ_package_facts.py && sleep 0' 7554 1726853181.09923: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853181.09926: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853181.09929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853181.09931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853181.09933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853181.09943: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853181.09955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853181.09976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853181.09979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853181.10031: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7554 1726853181.10034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853181.10036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853181.10038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 7554 1726853181.10040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853181.10042: stderr chunk (state=3): >>>debug2: match found <<< 7554 1726853181.10043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853181.10095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853181.10106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853181.10123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853181.10209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853181.55782: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": 
"4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", 
"version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": 
[{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 7554 1726853181.55809: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", 
"release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": 
"0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": 
"0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", 
"release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": 
"dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", 
"version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": 
"perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": 
"python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7554 1726853181.57473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853181.57586: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853181.57590: stdout chunk (state=3): >>><<< 7554 1726853181.57596: stderr chunk (state=3): >>><<< 7554 1726853181.57669: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853181.61961: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853181.61965: _low_level_execute_command(): starting 7554 1726853181.61968: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853180.980661-8868-95303299154885/ > /dev/null 2>&1 && sleep 0' 7554 1726853181.63093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853181.63097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853181.63309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853181.63384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853181.63431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853181.63551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853181.65444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853181.65554: stderr chunk (state=3): >>><<< 7554 1726853181.65557: stdout chunk (state=3): >>><<< 7554 1726853181.65589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853181.65592: handler run complete 7554 1726853181.67757: variable 'ansible_facts' from 
source: unknown 7554 1726853181.68406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853181.71097: variable 'ansible_facts' from source: unknown 7554 1726853181.72080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853181.73466: attempt loop complete, returning result 7554 1726853181.73551: _execute() done 7554 1726853181.73560: dumping result to json 7554 1726853181.73967: done dumping result, returning 7554 1726853181.74078: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-bdc3-98b6-000000001387] 7554 1726853181.74081: sending task result for task 02083763-bbaf-bdc3-98b6-000000001387 7554 1726853181.79617: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001387 7554 1726853181.79621: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853181.79769: no more pending results, returning what we have 7554 1726853181.79773: results queue empty 7554 1726853181.79774: checking for any_errors_fatal 7554 1726853181.79779: done checking for any_errors_fatal 7554 1726853181.79779: checking for max_fail_percentage 7554 1726853181.79781: done checking for max_fail_percentage 7554 1726853181.79782: checking to see if all hosts have failed and the running result is not ok 7554 1726853181.79782: done checking to see if all hosts have failed 7554 1726853181.79783: getting the remaining hosts for this loop 7554 1726853181.79784: done getting the remaining hosts for this loop 7554 1726853181.79787: getting the next task for host managed_node3 7554 1726853181.79794: done getting next task for host managed_node3 7554 1726853181.79797: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7554 
1726853181.79800: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853181.79809: getting variables 7554 1726853181.79810: in VariableManager get_vars() 7554 1726853181.79846: Calling all_inventory to load vars for managed_node3 7554 1726853181.79849: Calling groups_inventory to load vars for managed_node3 7554 1726853181.79851: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853181.79858: Calling all_plugins_play to load vars for managed_node3 7554 1726853181.79861: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853181.79863: Calling groups_plugins_play to load vars for managed_node3 7554 1726853181.82250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853181.85558: done with get_vars() 7554 1726853181.85594: done getting variables 7554 1726853181.85880: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:26:21 -0400 (0:00:00.930) 0:00:35.826 ****** 7554 
1726853181.85915: entering _queue_task() for managed_node3/debug 7554 1726853181.86541: worker is 1 (out of 1 available) 7554 1726853181.86555: exiting _queue_task() for managed_node3/debug 7554 1726853181.86567: done queuing things up, now waiting for results queue to drain 7554 1726853181.86568: waiting for pending results... 7554 1726853181.87289: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7554 1726853181.87307: in run() - task 02083763-bbaf-bdc3-98b6-0000000000b9 7554 1726853181.87328: variable 'ansible_search_path' from source: unknown 7554 1726853181.87337: variable 'ansible_search_path' from source: unknown 7554 1726853181.87382: calling self._execute() 7554 1726853181.87676: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853181.87689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853181.87707: variable 'omit' from source: magic vars 7554 1726853181.88474: variable 'ansible_distribution_major_version' from source: facts 7554 1726853181.88494: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853181.88506: variable 'omit' from source: magic vars 7554 1726853181.88564: variable 'omit' from source: magic vars 7554 1726853181.88756: variable 'network_provider' from source: set_fact 7554 1726853181.88780: variable 'omit' from source: magic vars 7554 1726853181.88820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853181.88865: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853181.88892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853181.88911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853181.88926: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853181.88965: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853181.88975: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853181.88984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853181.89094: Set connection var ansible_shell_executable to /bin/sh 7554 1726853181.89107: Set connection var ansible_pipelining to False 7554 1726853181.89113: Set connection var ansible_shell_type to sh 7554 1726853181.89119: Set connection var ansible_connection to ssh 7554 1726853181.89132: Set connection var ansible_timeout to 10 7554 1726853181.89140: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853181.89174: variable 'ansible_shell_executable' from source: unknown 7554 1726853181.89185: variable 'ansible_connection' from source: unknown 7554 1726853181.89192: variable 'ansible_module_compression' from source: unknown 7554 1726853181.89199: variable 'ansible_shell_type' from source: unknown 7554 1726853181.89205: variable 'ansible_shell_executable' from source: unknown 7554 1726853181.89211: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853181.89218: variable 'ansible_pipelining' from source: unknown 7554 1726853181.89224: variable 'ansible_timeout' from source: unknown 7554 1726853181.89232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853181.89404: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853181.89495: variable 'omit' from source: magic vars 7554 1726853181.89498: starting attempt loop 7554 
1726853181.89501: running the handler 7554 1726853181.89503: handler run complete 7554 1726853181.89506: attempt loop complete, returning result 7554 1726853181.89514: _execute() done 7554 1726853181.89521: dumping result to json 7554 1726853181.89527: done dumping result, returning 7554 1726853181.89539: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-bdc3-98b6-0000000000b9] 7554 1726853181.89573: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000b9 ok: [managed_node3] => {} MSG: Using network provider: nm 7554 1726853181.89749: no more pending results, returning what we have 7554 1726853181.89752: results queue empty 7554 1726853181.89753: checking for any_errors_fatal 7554 1726853181.89762: done checking for any_errors_fatal 7554 1726853181.89762: checking for max_fail_percentage 7554 1726853181.89764: done checking for max_fail_percentage 7554 1726853181.89765: checking to see if all hosts have failed and the running result is not ok 7554 1726853181.89766: done checking to see if all hosts have failed 7554 1726853181.89766: getting the remaining hosts for this loop 7554 1726853181.89768: done getting the remaining hosts for this loop 7554 1726853181.89772: getting the next task for host managed_node3 7554 1726853181.89778: done getting next task for host managed_node3 7554 1726853181.89782: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7554 1726853181.89785: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853181.89796: getting variables 7554 1726853181.89797: in VariableManager get_vars() 7554 1726853181.89842: Calling all_inventory to load vars for managed_node3 7554 1726853181.89845: Calling groups_inventory to load vars for managed_node3 7554 1726853181.89847: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853181.89856: Calling all_plugins_play to load vars for managed_node3 7554 1726853181.89859: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853181.89862: Calling groups_plugins_play to load vars for managed_node3 7554 1726853181.91086: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000b9 7554 1726853181.91779: WORKER PROCESS EXITING 7554 1726853181.92600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853181.95510: done with get_vars() 7554 1726853181.95539: done getting variables 7554 1726853181.95902: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:26:21 -0400 (0:00:00.100) 0:00:35.926 ****** 7554 1726853181.95936: entering _queue_task() for managed_node3/fail 7554 1726853181.96686: worker is 1 (out of 1 available) 7554 1726853181.96698: exiting _queue_task() for managed_node3/fail 7554 
1726853181.96712: done queuing things up, now waiting for results queue to drain 7554 1726853181.96714: waiting for pending results... 7554 1726853181.97418: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7554 1726853181.98019: in run() - task 02083763-bbaf-bdc3-98b6-0000000000ba 7554 1726853181.98097: variable 'ansible_search_path' from source: unknown 7554 1726853181.98277: variable 'ansible_search_path' from source: unknown 7554 1726853181.98282: calling self._execute() 7554 1726853181.98421: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853181.98434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853181.98491: variable 'omit' from source: magic vars 7554 1726853181.99289: variable 'ansible_distribution_major_version' from source: facts 7554 1726853181.99304: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853181.99420: variable 'network_state' from source: role '' defaults 7554 1726853181.99777: Evaluated conditional (network_state != {}): False 7554 1726853181.99780: when evaluation is False, skipping this task 7554 1726853181.99782: _execute() done 7554 1726853181.99784: dumping result to json 7554 1726853181.99786: done dumping result, returning 7554 1726853181.99789: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-bdc3-98b6-0000000000ba] 7554 1726853181.99792: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000ba 7554 1726853181.99861: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000ba 7554 1726853181.99865: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853181.99918: no more pending results, returning what we have 7554 1726853181.99921: results queue empty 7554 1726853181.99922: checking for any_errors_fatal 7554 1726853181.99933: done checking for any_errors_fatal 7554 1726853181.99934: checking for max_fail_percentage 7554 1726853181.99935: done checking for max_fail_percentage 7554 1726853181.99936: checking to see if all hosts have failed and the running result is not ok 7554 1726853181.99937: done checking to see if all hosts have failed 7554 1726853181.99938: getting the remaining hosts for this loop 7554 1726853181.99940: done getting the remaining hosts for this loop 7554 1726853181.99944: getting the next task for host managed_node3 7554 1726853181.99950: done getting next task for host managed_node3 7554 1726853181.99954: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7554 1726853181.99958: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853181.99985: getting variables 7554 1726853181.99987: in VariableManager get_vars() 7554 1726853182.00041: Calling all_inventory to load vars for managed_node3 7554 1726853182.00045: Calling groups_inventory to load vars for managed_node3 7554 1726853182.00048: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853182.00061: Calling all_plugins_play to load vars for managed_node3 7554 1726853182.00064: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853182.00067: Calling groups_plugins_play to load vars for managed_node3 7554 1726853182.02943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853182.06729: done with get_vars() 7554 1726853182.06755: done getting variables 7554 1726853182.06900: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:26:22 -0400 (0:00:00.111) 0:00:36.038 ****** 7554 1726853182.07109: entering _queue_task() for managed_node3/fail 7554 1726853182.08069: worker is 1 (out of 1 available) 7554 1726853182.08482: exiting _queue_task() for managed_node3/fail 7554 1726853182.08492: done queuing things up, now waiting for results queue to drain 7554 1726853182.08494: waiting for pending results... 
7554 1726853182.08609: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7554 1726853182.09178: in run() - task 02083763-bbaf-bdc3-98b6-0000000000bb 7554 1726853182.09183: variable 'ansible_search_path' from source: unknown 7554 1726853182.09186: variable 'ansible_search_path' from source: unknown 7554 1726853182.09189: calling self._execute() 7554 1726853182.09230: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853182.09237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853182.09249: variable 'omit' from source: magic vars 7554 1726853182.10057: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.10067: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853182.10391: variable 'network_state' from source: role '' defaults 7554 1726853182.10406: Evaluated conditional (network_state != {}): False 7554 1726853182.10409: when evaluation is False, skipping this task 7554 1726853182.10412: _execute() done 7554 1726853182.10415: dumping result to json 7554 1726853182.10417: done dumping result, returning 7554 1726853182.10427: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-bdc3-98b6-0000000000bb] 7554 1726853182.10432: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000bb skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853182.10589: no more pending results, returning what we have 7554 1726853182.10592: results queue empty 7554 1726853182.10593: checking for any_errors_fatal 7554 1726853182.10601: done checking for any_errors_fatal 7554 1726853182.10602: 
checking for max_fail_percentage 7554 1726853182.10604: done checking for max_fail_percentage 7554 1726853182.10605: checking to see if all hosts have failed and the running result is not ok 7554 1726853182.10606: done checking to see if all hosts have failed 7554 1726853182.10607: getting the remaining hosts for this loop 7554 1726853182.10608: done getting the remaining hosts for this loop 7554 1726853182.10612: getting the next task for host managed_node3 7554 1726853182.10620: done getting next task for host managed_node3 7554 1726853182.10624: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7554 1726853182.10628: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853182.10660: getting variables 7554 1726853182.10663: in VariableManager get_vars() 7554 1726853182.10715: Calling all_inventory to load vars for managed_node3 7554 1726853182.10718: Calling groups_inventory to load vars for managed_node3 7554 1726853182.10721: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853182.10733: Calling all_plugins_play to load vars for managed_node3 7554 1726853182.10737: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853182.10739: Calling groups_plugins_play to load vars for managed_node3 7554 1726853182.11592: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000bb 7554 1726853182.11597: WORKER PROCESS EXITING 7554 1726853182.13686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853182.16207: done with get_vars() 7554 1726853182.16247: done getting variables 7554 1726853182.16310: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:26:22 -0400 (0:00:00.092) 0:00:36.131 ****** 7554 1726853182.16352: entering _queue_task() for managed_node3/fail 7554 1726853182.16817: worker is 1 (out of 1 available) 7554 1726853182.16830: exiting _queue_task() for managed_node3/fail 7554 1726853182.16842: done queuing things up, now waiting for results queue to drain 7554 1726853182.16843: waiting for pending results... 
7554 1726853182.17067: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7554 1726853182.17209: in run() - task 02083763-bbaf-bdc3-98b6-0000000000bc 7554 1726853182.17225: variable 'ansible_search_path' from source: unknown 7554 1726853182.17229: variable 'ansible_search_path' from source: unknown 7554 1726853182.17265: calling self._execute() 7554 1726853182.17364: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853182.17373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853182.17383: variable 'omit' from source: magic vars 7554 1726853182.17769: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.17783: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853182.17966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853182.20249: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853182.20317: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853182.20354: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853182.20389: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853182.20415: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853182.20578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.20582: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.20585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.20607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.20629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.20735: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.20764: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7554 1726853182.20889: variable 'ansible_distribution' from source: facts 7554 1726853182.20899: variable '__network_rh_distros' from source: role '' defaults 7554 1726853182.20914: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7554 1726853182.21162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.21196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.21223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.21260: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.21278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.21331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.21358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.21389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.21435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.21519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.21523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.21526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 
1726853182.21546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.21590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.21609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.21951: variable 'network_connections' from source: task vars 7554 1726853182.21975: variable 'interface' from source: play vars 7554 1726853182.22039: variable 'interface' from source: play vars 7554 1726853182.22064: variable 'network_state' from source: role '' defaults 7554 1726853182.22127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853182.22305: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853182.22344: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853182.22386: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853182.22496: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853182.22499: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853182.22502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 7554 1726853182.22531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.22561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853182.22608: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 7554 1726853182.22617: when evaluation is False, skipping this task 7554 1726853182.22631: _execute() done 7554 1726853182.22639: dumping result to json 7554 1726853182.22647: done dumping result, returning 7554 1726853182.22659: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-bdc3-98b6-0000000000bc] 7554 1726853182.22672: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000bc 7554 1726853182.22799: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000bc 7554 1726853182.22802: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 7554 1726853182.22856: no more pending results, returning what we have 7554 1726853182.22860: results queue empty 7554 1726853182.22861: checking for any_errors_fatal 7554 
1726853182.22870: done checking for any_errors_fatal 7554 1726853182.22872: checking for max_fail_percentage 7554 1726853182.22874: done checking for max_fail_percentage 7554 1726853182.22875: checking to see if all hosts have failed and the running result is not ok 7554 1726853182.22877: done checking to see if all hosts have failed 7554 1726853182.22877: getting the remaining hosts for this loop 7554 1726853182.22879: done getting the remaining hosts for this loop 7554 1726853182.22883: getting the next task for host managed_node3 7554 1726853182.22890: done getting next task for host managed_node3 7554 1726853182.22894: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7554 1726853182.22896: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853182.22919: getting variables 7554 1726853182.22921: in VariableManager get_vars() 7554 1726853182.23188: Calling all_inventory to load vars for managed_node3 7554 1726853182.23192: Calling groups_inventory to load vars for managed_node3 7554 1726853182.23195: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853182.23205: Calling all_plugins_play to load vars for managed_node3 7554 1726853182.23209: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853182.23212: Calling groups_plugins_play to load vars for managed_node3 7554 1726853182.24874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853182.26520: done with get_vars() 7554 1726853182.26546: done getting variables 7554 1726853182.26626: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:26:22 -0400 (0:00:00.103) 0:00:36.234 ****** 7554 1726853182.26665: entering _queue_task() for managed_node3/dnf 7554 1726853182.27116: worker is 1 (out of 1 available) 7554 1726853182.27128: exiting _queue_task() for managed_node3/dnf 7554 1726853182.27140: done queuing things up, now waiting for results queue to drain 7554 1726853182.27141: waiting for pending results... 
7554 1726853182.27491: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7554 1726853182.27515: in run() - task 02083763-bbaf-bdc3-98b6-0000000000bd 7554 1726853182.27536: variable 'ansible_search_path' from source: unknown 7554 1726853182.27544: variable 'ansible_search_path' from source: unknown 7554 1726853182.27594: calling self._execute() 7554 1726853182.27726: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853182.27747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853182.27769: variable 'omit' from source: magic vars 7554 1726853182.28178: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.28240: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853182.28417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853182.31178: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853182.31442: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853182.31445: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853182.31576: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853182.31589: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853182.31781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.31823: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.32076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.32087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.32091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.32426: variable 'ansible_distribution' from source: facts 7554 1726853182.32430: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.32448: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7554 1726853182.32704: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853182.32990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.33173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.33177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.33181: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.33184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.33319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.33495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.33498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.33501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.33503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.33643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.33749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.33783: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.34074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.34077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.34295: variable 'network_connections' from source: task vars 7554 1726853182.34325: variable 'interface' from source: play vars 7554 1726853182.34414: variable 'interface' from source: play vars 7554 1726853182.34496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853182.34726: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853182.34768: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853182.34806: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853182.34846: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853182.34910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853182.34957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853182.35041: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.35062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853182.35126: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853182.35479: variable 'network_connections' from source: task vars 7554 1726853182.35483: variable 'interface' from source: play vars 7554 1726853182.35493: variable 'interface' from source: play vars 7554 1726853182.35535: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7554 1726853182.35543: when evaluation is False, skipping this task 7554 1726853182.35550: _execute() done 7554 1726853182.35556: dumping result to json 7554 1726853182.35562: done dumping result, returning 7554 1726853182.35577: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-0000000000bd] 7554 1726853182.35595: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000bd skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7554 1726853182.35860: no more pending results, returning what we have 7554 1726853182.35864: results queue empty 7554 1726853182.35865: checking for any_errors_fatal 7554 1726853182.35872: done checking for any_errors_fatal 7554 1726853182.35873: checking for max_fail_percentage 7554 1726853182.35874: done checking for max_fail_percentage 7554 1726853182.35876: checking to see if all hosts have failed 
and the running result is not ok 7554 1726853182.35877: done checking to see if all hosts have failed 7554 1726853182.35878: getting the remaining hosts for this loop 7554 1726853182.35879: done getting the remaining hosts for this loop 7554 1726853182.35883: getting the next task for host managed_node3 7554 1726853182.35890: done getting next task for host managed_node3 7554 1726853182.35894: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7554 1726853182.35897: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853182.35920: getting variables 7554 1726853182.35922: in VariableManager get_vars() 7554 1726853182.36088: Calling all_inventory to load vars for managed_node3 7554 1726853182.36091: Calling groups_inventory to load vars for managed_node3 7554 1726853182.36094: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853182.36105: Calling all_plugins_play to load vars for managed_node3 7554 1726853182.36108: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853182.36111: Calling groups_plugins_play to load vars for managed_node3 7554 1726853182.36685: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000bd 7554 1726853182.36688: WORKER PROCESS EXITING 7554 1726853182.37597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853182.39858: done with get_vars() 7554 1726853182.39892: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7554 1726853182.39977: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:26:22 -0400 (0:00:00.133) 0:00:36.367 ****** 7554 1726853182.40009: entering _queue_task() for managed_node3/yum 7554 1726853182.40481: worker is 1 (out of 1 available) 7554 1726853182.40493: exiting _queue_task() for managed_node3/yum 7554 1726853182.40504: done queuing things up, now waiting for results 
queue to drain 7554 1726853182.40506: waiting for pending results... 7554 1726853182.40713: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7554 1726853182.40897: in run() - task 02083763-bbaf-bdc3-98b6-0000000000be 7554 1726853182.40980: variable 'ansible_search_path' from source: unknown 7554 1726853182.40989: variable 'ansible_search_path' from source: unknown 7554 1726853182.41033: calling self._execute() 7554 1726853182.41163: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853182.41181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853182.41196: variable 'omit' from source: magic vars 7554 1726853182.41977: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.41980: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853182.42303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853182.45322: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853182.45396: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853182.45451: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853182.45504: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853182.45532: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853182.45617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7554 1726853182.45650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.45980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.45983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.45985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.46048: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.46102: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7554 1726853182.46111: when evaluation is False, skipping this task 7554 1726853182.46118: _execute() done 7554 1726853182.46124: dumping result to json 7554 1726853182.46133: done dumping result, returning 7554 1726853182.46145: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-0000000000be] 7554 1726853182.46156: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000be skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7554 1726853182.46350: no more pending results, returning what we have 7554 1726853182.46354: results queue empty 7554 1726853182.46355: checking for 
any_errors_fatal 7554 1726853182.46363: done checking for any_errors_fatal 7554 1726853182.46363: checking for max_fail_percentage 7554 1726853182.46365: done checking for max_fail_percentage 7554 1726853182.46366: checking to see if all hosts have failed and the running result is not ok 7554 1726853182.46367: done checking to see if all hosts have failed 7554 1726853182.46368: getting the remaining hosts for this loop 7554 1726853182.46370: done getting the remaining hosts for this loop 7554 1726853182.46377: getting the next task for host managed_node3 7554 1726853182.46385: done getting next task for host managed_node3 7554 1726853182.46389: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7554 1726853182.46392: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853182.46417: getting variables 7554 1726853182.46419: in VariableManager get_vars() 7554 1726853182.46592: Calling all_inventory to load vars for managed_node3 7554 1726853182.46595: Calling groups_inventory to load vars for managed_node3 7554 1726853182.46598: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853182.46610: Calling all_plugins_play to load vars for managed_node3 7554 1726853182.46613: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853182.46616: Calling groups_plugins_play to load vars for managed_node3 7554 1726853182.47219: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000be 7554 1726853182.47222: WORKER PROCESS EXITING 7554 1726853182.56426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853182.58224: done with get_vars() 7554 1726853182.58268: done getting variables 7554 1726853182.58321: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:26:22 -0400 (0:00:00.183) 0:00:36.551 ****** 7554 1726853182.58360: entering _queue_task() for managed_node3/fail 7554 1726853182.58749: worker is 1 (out of 1 available) 7554 1726853182.58763: exiting _queue_task() for managed_node3/fail 7554 1726853182.59077: done queuing things up, now waiting for results queue to drain 7554 1726853182.59080: waiting for pending results... 
7554 1726853182.59248: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7554 1726853182.59392: in run() - task 02083763-bbaf-bdc3-98b6-0000000000bf 7554 1726853182.59401: variable 'ansible_search_path' from source: unknown 7554 1726853182.59406: variable 'ansible_search_path' from source: unknown 7554 1726853182.59444: calling self._execute() 7554 1726853182.59535: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853182.59540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853182.59550: variable 'omit' from source: magic vars 7554 1726853182.59938: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.59959: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853182.60091: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853182.60320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853182.62782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853182.62787: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853182.62824: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853182.62872: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853182.62911: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853182.63099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7554 1726853182.63103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.63106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.63121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.63146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.63202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.63236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.63321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.63325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.63336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7554 1726853182.63387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.63416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.63456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.63501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.63521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.63718: variable 'network_connections' from source: task vars 7554 1726853182.63754: variable 'interface' from source: play vars 7554 1726853182.63821: variable 'interface' from source: play vars 7554 1726853182.63973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853182.64105: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853182.64150: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853182.64192: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853182.64227: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7554 1726853182.64279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853182.64313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853182.64346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.64380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853182.64456: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853182.64776: variable 'network_connections' from source: task vars 7554 1726853182.64780: variable 'interface' from source: play vars 7554 1726853182.64812: variable 'interface' from source: play vars 7554 1726853182.64861: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7554 1726853182.64870: when evaluation is False, skipping this task 7554 1726853182.64880: _execute() done 7554 1726853182.64887: dumping result to json 7554 1726853182.64894: done dumping result, returning 7554 1726853182.64906: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-0000000000bf] 7554 1726853182.64917: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000bf 7554 1726853182.65121: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000bf 7554 1726853182.65123: WORKER PROCESS EXITING skipping: [managed_node3] => { 
"changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7554 1726853182.65179: no more pending results, returning what we have 7554 1726853182.65182: results queue empty 7554 1726853182.65183: checking for any_errors_fatal 7554 1726853182.65194: done checking for any_errors_fatal 7554 1726853182.65194: checking for max_fail_percentage 7554 1726853182.65196: done checking for max_fail_percentage 7554 1726853182.65197: checking to see if all hosts have failed and the running result is not ok 7554 1726853182.65198: done checking to see if all hosts have failed 7554 1726853182.65198: getting the remaining hosts for this loop 7554 1726853182.65199: done getting the remaining hosts for this loop 7554 1726853182.65203: getting the next task for host managed_node3 7554 1726853182.65209: done getting next task for host managed_node3 7554 1726853182.65213: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7554 1726853182.65216: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853182.65476: getting variables 7554 1726853182.65478: in VariableManager get_vars() 7554 1726853182.65520: Calling all_inventory to load vars for managed_node3 7554 1726853182.65523: Calling groups_inventory to load vars for managed_node3 7554 1726853182.65525: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853182.65533: Calling all_plugins_play to load vars for managed_node3 7554 1726853182.65535: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853182.65538: Calling groups_plugins_play to load vars for managed_node3 7554 1726853182.66927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853182.68537: done with get_vars() 7554 1726853182.68566: done getting variables 7554 1726853182.68635: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:26:22 -0400 (0:00:00.103) 0:00:36.654 ****** 7554 1726853182.68675: entering _queue_task() for managed_node3/package 7554 1726853182.69022: worker is 1 (out of 1 available) 7554 1726853182.69033: exiting _queue_task() for managed_node3/package 7554 1726853182.69048: done queuing things up, now waiting for results queue to drain 7554 1726853182.69163: waiting for pending results... 
7554 1726853182.69493: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7554 1726853182.69578: in run() - task 02083763-bbaf-bdc3-98b6-0000000000c0 7554 1726853182.69584: variable 'ansible_search_path' from source: unknown 7554 1726853182.69586: variable 'ansible_search_path' from source: unknown 7554 1726853182.69589: calling self._execute() 7554 1726853182.69704: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853182.69707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853182.69812: variable 'omit' from source: magic vars 7554 1726853182.70125: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.70152: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853182.70369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853182.70667: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853182.70727: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853182.70829: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853182.70876: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853182.71025: variable 'network_packages' from source: role '' defaults 7554 1726853182.71158: variable '__network_provider_setup' from source: role '' defaults 7554 1726853182.71177: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853182.71256: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853182.71273: variable '__network_packages_default_nm' from source: role '' defaults 7554 1726853182.71350: variable '__network_packages_default_nm' from source: role 
'' defaults 7554 1726853182.71576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853182.73919: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853182.74057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853182.74061: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853182.74089: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853182.74120: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853182.74214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.74253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.74291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.74378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.74381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.74411: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.74439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.74472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.74526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.74551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.74875: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7554 1726853182.74973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.75006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.75038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.75091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.75116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.75229: variable 'ansible_python' from source: facts 7554 1726853182.75476: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7554 1726853182.75479: variable '__network_wpa_supplicant_required' from source: role '' defaults 7554 1726853182.75482: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7554 1726853182.75581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.75615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.75645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.75688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.75716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.75764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853182.75797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853182.75830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.75869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853182.75890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853182.76063: variable 'network_connections' from source: task vars 7554 1726853182.76078: variable 'interface' from source: play vars 7554 1726853182.76195: variable 'interface' from source: play vars 7554 1726853182.76290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853182.76322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853182.76367: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853182.76406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853182.76577: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853182.76787: variable 'network_connections' from source: task vars 7554 1726853182.76806: variable 'interface' from source: play vars 7554 1726853182.76918: variable 'interface' from source: play vars 7554 1726853182.76984: variable '__network_packages_default_wireless' from source: role '' defaults 7554 1726853182.77078: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853182.77419: variable 'network_connections' from source: task vars 7554 1726853182.77431: variable 'interface' from source: play vars 7554 1726853182.77562: variable 'interface' from source: play vars 7554 1726853182.77565: variable '__network_packages_default_team' from source: role '' defaults 7554 1726853182.77627: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853182.77958: variable 'network_connections' from source: task vars 7554 1726853182.77969: variable 'interface' from source: play vars 7554 1726853182.78258: variable 'interface' from source: play vars 7554 1726853182.78336: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853182.78407: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853182.78420: variable '__network_packages_default_initscripts' from source: role '' defaults 7554 1726853182.78576: variable '__network_packages_default_initscripts' from source: role '' defaults 7554 1726853182.78734: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7554 1726853182.79278: variable 'network_connections' from source: task vars 7554 1726853182.79291: variable 'interface' from source: play vars 7554 1726853182.79364: variable 'interface' from source: play vars 7554 
1726853182.79384: variable 'ansible_distribution' from source: facts 7554 1726853182.79391: variable '__network_rh_distros' from source: role '' defaults 7554 1726853182.79401: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.79426: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7554 1726853182.79679: variable 'ansible_distribution' from source: facts 7554 1726853182.79682: variable '__network_rh_distros' from source: role '' defaults 7554 1726853182.79684: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.79687: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7554 1726853182.79817: variable 'ansible_distribution' from source: facts 7554 1726853182.79830: variable '__network_rh_distros' from source: role '' defaults 7554 1726853182.79841: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.79892: variable 'network_provider' from source: set_fact 7554 1726853182.79918: variable 'ansible_facts' from source: unknown 7554 1726853182.80776: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7554 1726853182.80780: when evaluation is False, skipping this task 7554 1726853182.80782: _execute() done 7554 1726853182.80785: dumping result to json 7554 1726853182.80787: done dumping result, returning 7554 1726853182.80789: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-bdc3-98b6-0000000000c0] 7554 1726853182.80791: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c0 7554 1726853182.80865: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c0 7554 1726853182.80869: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result 
was False" } 7554 1726853182.80924: no more pending results, returning what we have 7554 1726853182.80928: results queue empty 7554 1726853182.80929: checking for any_errors_fatal 7554 1726853182.80937: done checking for any_errors_fatal 7554 1726853182.80938: checking for max_fail_percentage 7554 1726853182.80940: done checking for max_fail_percentage 7554 1726853182.80941: checking to see if all hosts have failed and the running result is not ok 7554 1726853182.80944: done checking to see if all hosts have failed 7554 1726853182.80945: getting the remaining hosts for this loop 7554 1726853182.80947: done getting the remaining hosts for this loop 7554 1726853182.80951: getting the next task for host managed_node3 7554 1726853182.80959: done getting next task for host managed_node3 7554 1726853182.80963: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7554 1726853182.80966: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853182.81030: getting variables 7554 1726853182.81032: in VariableManager get_vars() 7554 1726853182.81221: Calling all_inventory to load vars for managed_node3 7554 1726853182.81224: Calling groups_inventory to load vars for managed_node3 7554 1726853182.81227: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853182.81237: Calling all_plugins_play to load vars for managed_node3 7554 1726853182.81240: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853182.81246: Calling groups_plugins_play to load vars for managed_node3 7554 1726853182.82995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853182.86093: done with get_vars() 7554 1726853182.86122: done getting variables 7554 1726853182.86292: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:26:22 -0400 (0:00:00.176) 0:00:36.830 ****** 7554 1726853182.86327: entering _queue_task() for managed_node3/package 7554 1726853182.87032: worker is 1 (out of 1 available) 7554 1726853182.87046: exiting _queue_task() for managed_node3/package 7554 1726853182.87057: done queuing things up, now waiting for results queue to drain 7554 1726853182.87059: waiting for pending results... 
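The "Install packages" skip recorded above hinges on Jinja2's `subset` test: the conditional `not network_packages is subset(ansible_facts.packages.keys())` evaluated to False, meaning every required package name already appears among the gathered package facts, so there is nothing to install. A minimal Python sketch of that set logic follows; the package names and fact shape are illustrative assumptions, not values taken from this run:

```python
# Sketch of the Jinja2 "subset" test behind the skip:
#   not network_packages is subset(ansible_facts.packages.keys())
# Package names and versions below are illustrative assumptions,
# not data from this log.
network_packages = ["NetworkManager"]
packages_fact = {
    "NetworkManager": [{"version": "1.x"}],
    "kernel": [{"version": "6.x"}],
}

# Jinja2's "is subset" corresponds to Python's set <= comparison.
already_installed = set(network_packages) <= set(packages_fact.keys())
run_install_task = not already_installed

print(run_install_task)  # False -> the task is skipped, as in the log
```

With an extra package such as `nmstate` absent from the facts, the subset check would fail and the task would run instead of being skipped.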
7554 1726853182.87357: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7554 1726853182.87530: in run() - task 02083763-bbaf-bdc3-98b6-0000000000c1 7554 1726853182.87546: variable 'ansible_search_path' from source: unknown 7554 1726853182.87550: variable 'ansible_search_path' from source: unknown 7554 1726853182.87653: calling self._execute() 7554 1726853182.87683: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853182.87688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853182.87699: variable 'omit' from source: magic vars 7554 1726853182.88076: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.88086: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853182.88202: variable 'network_state' from source: role '' defaults 7554 1726853182.88212: Evaluated conditional (network_state != {}): False 7554 1726853182.88215: when evaluation is False, skipping this task 7554 1726853182.88218: _execute() done 7554 1726853182.88222: dumping result to json 7554 1726853182.88224: done dumping result, returning 7554 1726853182.88233: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-bdc3-98b6-0000000000c1] 7554 1726853182.88239: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c1 7554 1726853182.88412: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c1 7554 1726853182.88415: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853182.88462: no more pending results, returning what we have 7554 1726853182.88464: results queue empty 7554 1726853182.88465: checking for any_errors_fatal 
7554 1726853182.88469: done checking for any_errors_fatal 7554 1726853182.88470: checking for max_fail_percentage 7554 1726853182.88473: done checking for max_fail_percentage 7554 1726853182.88474: checking to see if all hosts have failed and the running result is not ok 7554 1726853182.88475: done checking to see if all hosts have failed 7554 1726853182.88476: getting the remaining hosts for this loop 7554 1726853182.88477: done getting the remaining hosts for this loop 7554 1726853182.88480: getting the next task for host managed_node3 7554 1726853182.88485: done getting next task for host managed_node3 7554 1726853182.88488: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7554 1726853182.88491: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853182.88508: getting variables 7554 1726853182.88509: in VariableManager get_vars() 7554 1726853182.88609: Calling all_inventory to load vars for managed_node3 7554 1726853182.88612: Calling groups_inventory to load vars for managed_node3 7554 1726853182.88614: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853182.88623: Calling all_plugins_play to load vars for managed_node3 7554 1726853182.88626: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853182.88629: Calling groups_plugins_play to load vars for managed_node3 7554 1726853182.90308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853182.92222: done with get_vars() 7554 1726853182.92247: done getting variables 7554 1726853182.92317: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:26:22 -0400 (0:00:00.060) 0:00:36.891 ****** 7554 1726853182.92355: entering _queue_task() for managed_node3/package 7554 1726853182.92900: worker is 1 (out of 1 available) 7554 1726853182.92919: exiting _queue_task() for managed_node3/package 7554 1726853182.92931: done queuing things up, now waiting for results queue to drain 7554 1726853182.92932: waiting for pending results... 
7554 1726853182.93694: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7554 1726853182.93806: in run() - task 02083763-bbaf-bdc3-98b6-0000000000c2 7554 1726853182.93819: variable 'ansible_search_path' from source: unknown 7554 1726853182.93823: variable 'ansible_search_path' from source: unknown 7554 1726853182.93857: calling self._execute() 7554 1726853182.94064: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853182.94072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853182.94187: variable 'omit' from source: magic vars 7554 1726853182.94731: variable 'ansible_distribution_major_version' from source: facts 7554 1726853182.94746: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853182.95262: variable 'network_state' from source: role '' defaults 7554 1726853182.95307: Evaluated conditional (network_state != {}): False 7554 1726853182.95311: when evaluation is False, skipping this task 7554 1726853182.95313: _execute() done 7554 1726853182.95316: dumping result to json 7554 1726853182.95318: done dumping result, returning 7554 1726853182.95321: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-bdc3-98b6-0000000000c2] 7554 1726853182.95323: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c2 7554 1726853182.95400: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c2 7554 1726853182.95403: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853182.95456: no more pending results, returning what we have 7554 1726853182.95460: results queue empty 7554 1726853182.95460: checking for any_errors_fatal 7554 
1726853182.95470: done checking for any_errors_fatal 7554 1726853182.95472: checking for max_fail_percentage 7554 1726853182.95474: done checking for max_fail_percentage 7554 1726853182.95475: checking to see if all hosts have failed and the running result is not ok 7554 1726853182.95476: done checking to see if all hosts have failed 7554 1726853182.95477: getting the remaining hosts for this loop 7554 1726853182.95478: done getting the remaining hosts for this loop 7554 1726853182.95482: getting the next task for host managed_node3 7554 1726853182.95488: done getting next task for host managed_node3 7554 1726853182.95492: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7554 1726853182.95496: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853182.95520: getting variables 7554 1726853182.95522: in VariableManager get_vars() 7554 1726853182.95568: Calling all_inventory to load vars for managed_node3 7554 1726853182.95676: Calling groups_inventory to load vars for managed_node3 7554 1726853182.95680: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853182.95691: Calling all_plugins_play to load vars for managed_node3 7554 1726853182.95694: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853182.95697: Calling groups_plugins_play to load vars for managed_node3 7554 1726853182.98500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853183.01753: done with get_vars() 7554 1726853183.01898: done getting variables 7554 1726853183.01962: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:26:23 -0400 (0:00:00.097) 0:00:36.988 ****** 7554 1726853183.02113: entering _queue_task() for managed_node3/service 7554 1726853183.02793: worker is 1 (out of 1 available) 7554 1726853183.02882: exiting _queue_task() for managed_node3/service 7554 1726853183.02893: done queuing things up, now waiting for results queue to drain 7554 1726853183.02895: waiting for pending results... 
7554 1726853183.03359: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7554 1726853183.03674: in run() - task 02083763-bbaf-bdc3-98b6-0000000000c3 7554 1726853183.03678: variable 'ansible_search_path' from source: unknown 7554 1726853183.03682: variable 'ansible_search_path' from source: unknown 7554 1726853183.03852: calling self._execute() 7554 1726853183.03983: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853183.03987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853183.04107: variable 'omit' from source: magic vars 7554 1726853183.04889: variable 'ansible_distribution_major_version' from source: facts 7554 1726853183.04902: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853183.05161: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853183.05631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853183.09808: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853183.09904: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853183.09950: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853183.09997: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853183.10031: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853183.10127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 
1726853183.10166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.10199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.10253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853183.10276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853183.10336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853183.10431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.10435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.10448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853183.10468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7554 1726853183.10516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853183.10553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.10583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.10625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853183.10651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853183.10840: variable 'network_connections' from source: task vars 7554 1726853183.10869: variable 'interface' from source: play vars 7554 1726853183.11037: variable 'interface' from source: play vars 7554 1726853183.11143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853183.11387: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853183.11848: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853183.11867: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853183.11902: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853183.11947: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853183.11982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853183.12009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.12030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853183.12103: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853183.12267: variable 'network_connections' from source: task vars 7554 1726853183.12272: variable 'interface' from source: play vars 7554 1726853183.12319: variable 'interface' from source: play vars 7554 1726853183.12347: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7554 1726853183.12351: when evaluation is False, skipping this task 7554 1726853183.12354: _execute() done 7554 1726853183.12356: dumping result to json 7554 1726853183.12358: done dumping result, returning 7554 1726853183.12363: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-0000000000c3] 7554 1726853183.12368: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c3 7554 1726853183.12461: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c3 7554 1726853183.12474: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7554 1726853183.12518: no more pending results, returning what we have 7554 1726853183.12522: results queue empty 7554 1726853183.12522: checking for any_errors_fatal 7554 1726853183.12528: done checking for any_errors_fatal 7554 1726853183.12529: checking for max_fail_percentage 7554 1726853183.12531: done checking for max_fail_percentage 7554 1726853183.12531: checking to see if all hosts have failed and the running result is not ok 7554 1726853183.12532: done checking to see if all hosts have failed 7554 1726853183.12533: getting the remaining hosts for this loop 7554 1726853183.12534: done getting the remaining hosts for this loop 7554 1726853183.12538: getting the next task for host managed_node3 7554 1726853183.12546: done getting next task for host managed_node3 7554 1726853183.12550: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7554 1726853183.12553: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853183.12574: getting variables 7554 1726853183.12576: in VariableManager get_vars() 7554 1726853183.12624: Calling all_inventory to load vars for managed_node3 7554 1726853183.12626: Calling groups_inventory to load vars for managed_node3 7554 1726853183.12628: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853183.12638: Calling all_plugins_play to load vars for managed_node3 7554 1726853183.12640: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853183.12645: Calling groups_plugins_play to load vars for managed_node3 7554 1726853183.14025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853183.15445: done with get_vars() 7554 1726853183.15472: done getting variables 7554 1726853183.15534: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:26:23 -0400 (0:00:00.134) 0:00:37.123 ****** 7554 1726853183.15568: entering _queue_task() for managed_node3/service 7554 1726853183.15893: worker is 1 (out of 1 available) 7554 1726853183.15913: exiting _queue_task() for managed_node3/service 7554 1726853183.15929: done queuing things up, now waiting for results queue to drain 7554 1726853183.15931: waiting for pending results... 
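The "Restart NetworkManager due to wireless or team interfaces" skip recorded above is gated on `__network_wireless_connections_defined or __network_team_connections_defined`: the role inspects `network_connections` for wireless or team connection types and restarts NetworkManager only if either is present. The role itself evaluates this with Jinja2 filters over its defaults; the following is only a rough Python equivalent, and the connection list is an assumed example, not data from this run:

```python
# Rough Python equivalent of the gate behind the skip above:
#   __network_wireless_connections_defined or __network_team_connections_defined
# The connection list is an illustrative assumption, not data from this log.
network_connections = [
    {"name": "ethtest0", "type": "ethernet", "state": "up"},
]

# A restart is only needed when a wireless or team connection is defined.
wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)
restart_needed = wireless_defined or team_defined

print(restart_needed)  # False -> NetworkManager is not restarted, as in the log
```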
7554 1726853183.16251: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7554 1726853183.16392: in run() - task 02083763-bbaf-bdc3-98b6-0000000000c4 7554 1726853183.16411: variable 'ansible_search_path' from source: unknown 7554 1726853183.16416: variable 'ansible_search_path' from source: unknown 7554 1726853183.16561: calling self._execute() 7554 1726853183.16574: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853183.16581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853183.16597: variable 'omit' from source: magic vars 7554 1726853183.17031: variable 'ansible_distribution_major_version' from source: facts 7554 1726853183.17043: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853183.17221: variable 'network_provider' from source: set_fact 7554 1726853183.17225: variable 'network_state' from source: role '' defaults 7554 1726853183.17231: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7554 1726853183.17237: variable 'omit' from source: magic vars 7554 1726853183.17308: variable 'omit' from source: magic vars 7554 1726853183.17342: variable 'network_service_name' from source: role '' defaults 7554 1726853183.17444: variable 'network_service_name' from source: role '' defaults 7554 1726853183.17547: variable '__network_provider_setup' from source: role '' defaults 7554 1726853183.17580: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853183.17661: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853183.17665: variable '__network_packages_default_nm' from source: role '' defaults 7554 1726853183.17701: variable '__network_packages_default_nm' from source: role '' defaults 7554 1726853183.18076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 
1726853183.19927: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853183.19999: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853183.20050: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853183.20076: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853183.20306: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853183.20384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853183.20412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.20435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.20697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853183.20704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853183.20754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7554 1726853183.20783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.20805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.20844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853183.20855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853183.21257: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7554 1726853183.21381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853183.21421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.21459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.21506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853183.21510: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853183.21683: variable 'ansible_python' from source: facts 7554 1726853183.21686: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7554 1726853183.21688: variable '__network_wpa_supplicant_required' from source: role '' defaults 7554 1726853183.21876: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7554 1726853183.21879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853183.21887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.21911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.21948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853183.21959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853183.22004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853183.22027: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.22049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.22088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853183.22109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853183.22325: variable 'network_connections' from source: task vars 7554 1726853183.22328: variable 'interface' from source: play vars 7554 1726853183.22330: variable 'interface' from source: play vars 7554 1726853183.22415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853183.22611: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853183.22651: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853183.22682: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853183.22720: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853183.22766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853183.22789: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853183.22810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.22833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853183.22869: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853183.23050: variable 'network_connections' from source: task vars 7554 1726853183.23055: variable 'interface' from source: play vars 7554 1726853183.23112: variable 'interface' from source: play vars 7554 1726853183.23148: variable '__network_packages_default_wireless' from source: role '' defaults 7554 1726853183.23225: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853183.23676: variable 'network_connections' from source: task vars 7554 1726853183.23680: variable 'interface' from source: play vars 7554 1726853183.23682: variable 'interface' from source: play vars 7554 1726853183.23684: variable '__network_packages_default_team' from source: role '' defaults 7554 1726853183.23687: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853183.23924: variable 'network_connections' from source: task vars 7554 1726853183.23927: variable 'interface' from source: play vars 7554 1726853183.23995: variable 'interface' from source: play vars 7554 1726853183.24061: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853183.24112: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853183.24119: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7554 1726853183.24176: variable '__network_packages_default_initscripts' from source: role '' defaults 7554 1726853183.24408: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7554 1726853183.24935: variable 'network_connections' from source: task vars 7554 1726853183.24941: variable 'interface' from source: play vars 7554 1726853183.25004: variable 'interface' from source: play vars 7554 1726853183.25010: variable 'ansible_distribution' from source: facts 7554 1726853183.25013: variable '__network_rh_distros' from source: role '' defaults 7554 1726853183.25021: variable 'ansible_distribution_major_version' from source: facts 7554 1726853183.25044: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7554 1726853183.25222: variable 'ansible_distribution' from source: facts 7554 1726853183.25225: variable '__network_rh_distros' from source: role '' defaults 7554 1726853183.25228: variable 'ansible_distribution_major_version' from source: facts 7554 1726853183.25240: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7554 1726853183.25408: variable 'ansible_distribution' from source: facts 7554 1726853183.25411: variable '__network_rh_distros' from source: role '' defaults 7554 1726853183.25417: variable 'ansible_distribution_major_version' from source: facts 7554 1726853183.25480: variable 'network_provider' from source: set_fact 7554 1726853183.25483: variable 'omit' from source: magic vars 7554 1726853183.25512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853183.25562: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853183.25566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853183.25587: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853183.25595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853183.25618: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853183.25621: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853183.25624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853183.25698: Set connection var ansible_shell_executable to /bin/sh 7554 1726853183.25705: Set connection var ansible_pipelining to False 7554 1726853183.25707: Set connection var ansible_shell_type to sh 7554 1726853183.25710: Set connection var ansible_connection to ssh 7554 1726853183.25717: Set connection var ansible_timeout to 10 7554 1726853183.25721: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853183.25740: variable 'ansible_shell_executable' from source: unknown 7554 1726853183.25746: variable 'ansible_connection' from source: unknown 7554 1726853183.25748: variable 'ansible_module_compression' from source: unknown 7554 1726853183.25750: variable 'ansible_shell_type' from source: unknown 7554 1726853183.25752: variable 'ansible_shell_executable' from source: unknown 7554 1726853183.25756: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853183.25768: variable 'ansible_pipelining' from source: unknown 7554 1726853183.25773: variable 'ansible_timeout' from source: unknown 7554 1726853183.25777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853183.25854: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853183.25863: variable 'omit' from source: magic vars 7554 1726853183.25872: starting attempt loop 7554 1726853183.25876: running the handler 7554 1726853183.25931: variable 'ansible_facts' from source: unknown 7554 1726853183.26410: _low_level_execute_command(): starting 7554 1726853183.26416: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853183.26906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853183.26909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853183.26912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853183.26915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853183.26958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853183.26962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853183.27037: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853183.28784: stdout chunk (state=3): >>>/root <<< 7554 1726853183.28878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853183.28904: stderr chunk (state=3): >>><<< 7554 1726853183.28908: stdout chunk (state=3): >>><<< 7554 1726853183.28926: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853183.28941: _low_level_execute_command(): starting 7554 1726853183.28944: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772 `" && echo ansible-tmp-1726853183.2892575-8939-58298495718772="` echo 
/root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772 `" ) && sleep 0' 7554 1726853183.29354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853183.29367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853183.29370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853183.29393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853183.29396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853183.29451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853183.29456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853183.29457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853183.29519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853183.31454: stdout chunk (state=3): >>>ansible-tmp-1726853183.2892575-8939-58298495718772=/root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772 <<< 7554 1726853183.31561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853183.31590: 
stderr chunk (state=3): >>><<< 7554 1726853183.31593: stdout chunk (state=3): >>><<< 7554 1726853183.31608: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853183.2892575-8939-58298495718772=/root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853183.31633: variable 'ansible_module_compression' from source: unknown 7554 1726853183.31679: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7554 1726853183.31733: variable 'ansible_facts' from source: unknown 7554 1726853183.31874: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772/AnsiballZ_systemd.py 7554 1726853183.31976: Sending initial data 7554 1726853183.31980: Sent initial data (153 
bytes) 7554 1726853183.32443: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853183.32447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853183.32453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853183.32456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853183.32458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853183.32503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853183.32506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853183.32513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853183.32574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853183.34203: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7554 1726853183.34206: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853183.34257: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853183.34317: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpqallhch2 /root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772/AnsiballZ_systemd.py <<< 7554 1726853183.34324: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772/AnsiballZ_systemd.py" <<< 7554 1726853183.34384: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpqallhch2" to remote "/root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772/AnsiballZ_systemd.py" <<< 7554 1726853183.34387: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772/AnsiballZ_systemd.py" <<< 7554 1726853183.35556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853183.35599: stderr chunk (state=3): >>><<< 7554 1726853183.35602: stdout chunk (state=3): >>><<< 7554 1726853183.35616: done transferring module to remote 7554 1726853183.35625: _low_level_execute_command(): starting 7554 1726853183.35630: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772/ 
/root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772/AnsiballZ_systemd.py && sleep 0' 7554 1726853183.36053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853183.36067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853183.36070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853183.36085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853183.36135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853183.36139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853183.36209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853183.38066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853183.38085: stderr chunk (state=3): >>><<< 7554 1726853183.38088: stdout chunk (state=3): >>><<< 7554 1726853183.38099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853183.38102: _low_level_execute_command(): starting 7554 1726853183.38107: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772/AnsiballZ_systemd.py && sleep 0' 7554 1726853183.38524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853183.38528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853183.38530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853183.38532: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853183.38535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853183.38577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853183.38586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853183.38653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853183.68936: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": 
"Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9547776", "MemoryPeak": "10067968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3333996544", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "183192000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": 
"[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", 
"OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service multi-user.target cloud-init.service network.target", "After": "cloud-init-local.service network-pre.target dbus.socket system.slice sysinit.target dbus-broker.service basic.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:21:20 EDT", "StateChangeTimestampMonotonic": "24855925", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", 
"InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7554 1726853183.70884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853183.70888: stdout chunk (state=3): >>><<< 7554 1726853183.70890: stderr chunk (state=3): >>><<< 7554 1726853183.70894: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9547776", "MemoryPeak": "10067968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3333996544", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "183192000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service multi-user.target cloud-init.service network.target", "After": "cloud-init-local.service network-pre.target dbus.socket system.slice sysinit.target dbus-broker.service basic.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:21:20 EDT", "StateChangeTimestampMonotonic": "24855925", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": 
"Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853183.71175: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853183.71196: _low_level_execute_command(): starting 7554 1726853183.71296: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853183.2892575-8939-58298495718772/ > /dev/null 2>&1 && sleep 0' 7554 1726853183.72598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853183.72603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853183.72805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853183.72870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853183.74787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853183.74791: stderr chunk (state=3): >>><<< 7554 1726853183.74794: stdout chunk (state=3): >>><<< 7554 1726853183.74812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853183.74819: handler run complete 7554 1726853183.75082: attempt loop complete, returning result 7554 
1726853183.75085: _execute() done 7554 1726853183.75088: dumping result to json 7554 1726853183.75109: done dumping result, returning 7554 1726853183.75120: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-bdc3-98b6-0000000000c4] 7554 1726853183.75125: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c4 7554 1726853183.75746: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c4 7554 1726853183.75751: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853183.75802: no more pending results, returning what we have 7554 1726853183.75805: results queue empty 7554 1726853183.75806: checking for any_errors_fatal 7554 1726853183.75810: done checking for any_errors_fatal 7554 1726853183.75811: checking for max_fail_percentage 7554 1726853183.75813: done checking for max_fail_percentage 7554 1726853183.75814: checking to see if all hosts have failed and the running result is not ok 7554 1726853183.75815: done checking to see if all hosts have failed 7554 1726853183.75815: getting the remaining hosts for this loop 7554 1726853183.75817: done getting the remaining hosts for this loop 7554 1726853183.75821: getting the next task for host managed_node3 7554 1726853183.75827: done getting next task for host managed_node3 7554 1726853183.75831: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7554 1726853183.75834: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853183.75843: getting variables 7554 1726853183.75845: in VariableManager get_vars() 7554 1726853183.75887: Calling all_inventory to load vars for managed_node3 7554 1726853183.75890: Calling groups_inventory to load vars for managed_node3 7554 1726853183.75892: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853183.75902: Calling all_plugins_play to load vars for managed_node3 7554 1726853183.75905: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853183.75907: Calling groups_plugins_play to load vars for managed_node3 7554 1726853183.78483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853183.80151: done with get_vars() 7554 1726853183.80200: done getting variables 7554 1726853183.80262: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:26:23 -0400 (0:00:00.647) 0:00:37.770 ****** 7554 1726853183.80304: entering _queue_task() for managed_node3/service 7554 1726853183.80810: worker is 1 (out of 1 available) 7554 1726853183.80822: exiting _queue_task() for managed_node3/service 7554 1726853183.80834: done queuing things up, now waiting for results queue to drain 7554 1726853183.80836: waiting for pending results... 
7554 1726853183.81015: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7554 1726853183.81154: in run() - task 02083763-bbaf-bdc3-98b6-0000000000c5 7554 1726853183.81175: variable 'ansible_search_path' from source: unknown 7554 1726853183.81178: variable 'ansible_search_path' from source: unknown 7554 1726853183.81214: calling self._execute() 7554 1726853183.81376: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853183.81380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853183.81386: variable 'omit' from source: magic vars 7554 1726853183.81748: variable 'ansible_distribution_major_version' from source: facts 7554 1726853183.81767: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853183.81891: variable 'network_provider' from source: set_fact 7554 1726853183.81908: Evaluated conditional (network_provider == "nm"): True 7554 1726853183.82011: variable '__network_wpa_supplicant_required' from source: role '' defaults 7554 1726853183.82117: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7554 1726853183.82307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853183.84761: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853183.84877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853183.84881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853183.84917: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853183.84952: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 
1726853183.85039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853183.85068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.85101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.85277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853183.85280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853183.85283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853183.85285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.85287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.85312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 7554 1726853183.85331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853183.85376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853183.85410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853183.85440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.85484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853183.85509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853183.85663: variable 'network_connections' from source: task vars 7554 1726853183.85686: variable 'interface' from source: play vars 7554 1726853183.85763: variable 'interface' from source: play vars 7554 1726853183.85949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853183.86044: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853183.86093: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853183.86127: Loading TestModule 
'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853183.86163: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853183.86223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853183.86278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853183.86287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853183.86317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853183.86393: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853183.86711: variable 'network_connections' from source: task vars 7554 1726853183.86714: variable 'interface' from source: play vars 7554 1726853183.86739: variable 'interface' from source: play vars 7554 1726853183.86786: Evaluated conditional (__network_wpa_supplicant_required): False 7554 1726853183.86808: when evaluation is False, skipping this task 7554 1726853183.86824: _execute() done 7554 1726853183.86839: dumping result to json 7554 1726853183.86847: done dumping result, returning 7554 1726853183.86859: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-bdc3-98b6-0000000000c5] 7554 1726853183.86907: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c5 skipping: [managed_node3] => { "changed": false, 
"false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7554 1726853183.87217: no more pending results, returning what we have 7554 1726853183.87220: results queue empty 7554 1726853183.87221: checking for any_errors_fatal 7554 1726853183.87243: done checking for any_errors_fatal 7554 1726853183.87244: checking for max_fail_percentage 7554 1726853183.87246: done checking for max_fail_percentage 7554 1726853183.87247: checking to see if all hosts have failed and the running result is not ok 7554 1726853183.87248: done checking to see if all hosts have failed 7554 1726853183.87248: getting the remaining hosts for this loop 7554 1726853183.87250: done getting the remaining hosts for this loop 7554 1726853183.87253: getting the next task for host managed_node3 7554 1726853183.87259: done getting next task for host managed_node3 7554 1726853183.87263: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7554 1726853183.87265: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853183.87288: getting variables 7554 1726853183.87290: in VariableManager get_vars() 7554 1726853183.87340: Calling all_inventory to load vars for managed_node3 7554 1726853183.87342: Calling groups_inventory to load vars for managed_node3 7554 1726853183.87345: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853183.87355: Calling all_plugins_play to load vars for managed_node3 7554 1726853183.87359: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853183.87361: Calling groups_plugins_play to load vars for managed_node3 7554 1726853183.87896: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c5 7554 1726853183.87900: WORKER PROCESS EXITING 7554 1726853183.91007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853183.94775: done with get_vars() 7554 1726853183.94812: done getting variables 7554 1726853183.95380: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:26:23 -0400 (0:00:00.151) 0:00:37.921 ****** 7554 1726853183.95417: entering _queue_task() for managed_node3/service 7554 1726853183.96514: worker is 1 (out of 1 available) 7554 1726853183.96526: exiting _queue_task() for managed_node3/service 7554 1726853183.96539: done queuing things up, now waiting for results queue to drain 7554 1726853183.96540: waiting for pending results... 
7554 1726853183.96701: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7554 1726853183.97152: in run() - task 02083763-bbaf-bdc3-98b6-0000000000c6 7554 1726853183.97169: variable 'ansible_search_path' from source: unknown 7554 1726853183.97182: variable 'ansible_search_path' from source: unknown 7554 1726853183.97221: calling self._execute() 7554 1726853183.97480: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853183.97487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853183.97505: variable 'omit' from source: magic vars 7554 1726853183.98417: variable 'ansible_distribution_major_version' from source: facts 7554 1726853183.98436: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853183.98726: variable 'network_provider' from source: set_fact 7554 1726853183.98738: Evaluated conditional (network_provider == "initscripts"): False 7554 1726853183.98750: when evaluation is False, skipping this task 7554 1726853183.98758: _execute() done 7554 1726853183.98766: dumping result to json 7554 1726853183.98776: done dumping result, returning 7554 1726853183.98877: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-bdc3-98b6-0000000000c6] 7554 1726853183.98880: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c6 7554 1726853183.99173: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c6 7554 1726853183.99177: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853183.99223: no more pending results, returning what we have 7554 1726853183.99227: results queue empty 7554 1726853183.99228: checking for any_errors_fatal 7554 1726853183.99237: done checking for any_errors_fatal 7554 
1726853183.99237: checking for max_fail_percentage 7554 1726853183.99239: done checking for max_fail_percentage 7554 1726853183.99240: checking to see if all hosts have failed and the running result is not ok 7554 1726853183.99241: done checking to see if all hosts have failed 7554 1726853183.99242: getting the remaining hosts for this loop 7554 1726853183.99243: done getting the remaining hosts for this loop 7554 1726853183.99247: getting the next task for host managed_node3 7554 1726853183.99254: done getting next task for host managed_node3 7554 1726853183.99257: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7554 1726853183.99260: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853183.99286: getting variables 7554 1726853183.99288: in VariableManager get_vars() 7554 1726853183.99339: Calling all_inventory to load vars for managed_node3 7554 1726853183.99341: Calling groups_inventory to load vars for managed_node3 7554 1726853183.99343: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853183.99355: Calling all_plugins_play to load vars for managed_node3 7554 1726853183.99358: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853183.99360: Calling groups_plugins_play to load vars for managed_node3 7554 1726853184.01325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853184.03034: done with get_vars() 7554 1726853184.03058: done getting variables 7554 1726853184.03123: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:26:24 -0400 (0:00:00.077) 0:00:37.999 ****** 7554 1726853184.03163: entering _queue_task() for managed_node3/copy 7554 1726853184.03521: worker is 1 (out of 1 available) 7554 1726853184.03647: exiting _queue_task() for managed_node3/copy 7554 1726853184.03657: done queuing things up, now waiting for results queue to drain 7554 1726853184.03659: waiting for pending results... 
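The skip above shows the two `when` clauses being evaluated in order: `ansible_distribution_major_version != '6'` is True, then `network_provider == "initscripts"` is False, so the task is skipped with `skip_reason: "Conditional result was False"`. A plain-Python stand-in for that short-circuit evaluation (the real implementation templates Jinja2 expressions; these lambdas are a hypothetical simplification):

```python
def evaluate_conditionals(task_when, variables):
    """Evaluate each 'when' clause in order; on the first False,
    skip the task instead of running it."""
    for cond in task_when:
        if not cond(variables):
            return {"changed": False,
                    "skip_reason": "Conditional result was False"}
    return None  # every conditional held: execute the task

host_vars = {"ansible_distribution_major_version": "40",  # assumed value
             "network_provider": "nm"}                    # from set_fact

when = [
    lambda v: v["ansible_distribution_major_version"] != "6",  # -> True
    lambda v: v["network_provider"] == "initscripts",          # -> False
]

result = evaluate_conditionals(when, host_vars)
```

With `network_provider` set to `nm`, the second clause fails and the skipped-task result is returned, matching the `skipping: [managed_node3]` output in the log.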
7554 1726853184.03849: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7554 1726853184.04078: in run() - task 02083763-bbaf-bdc3-98b6-0000000000c7 7554 1726853184.04083: variable 'ansible_search_path' from source: unknown 7554 1726853184.04085: variable 'ansible_search_path' from source: unknown 7554 1726853184.04088: calling self._execute() 7554 1726853184.04161: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853184.04165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853184.04168: variable 'omit' from source: magic vars 7554 1726853184.04530: variable 'ansible_distribution_major_version' from source: facts 7554 1726853184.04548: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853184.04856: variable 'network_provider' from source: set_fact 7554 1726853184.04860: Evaluated conditional (network_provider == "initscripts"): False 7554 1726853184.04864: when evaluation is False, skipping this task 7554 1726853184.04867: _execute() done 7554 1726853184.04870: dumping result to json 7554 1726853184.04873: done dumping result, returning 7554 1726853184.04876: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-bdc3-98b6-0000000000c7] 7554 1726853184.04878: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c7 7554 1726853184.04940: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c7 7554 1726853184.04945: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7554 1726853184.04987: no more pending results, returning what we have 7554 1726853184.04990: results queue empty 7554 1726853184.04991: checking for any_errors_fatal 7554 
1726853184.04995: done checking for any_errors_fatal 7554 1726853184.04996: checking for max_fail_percentage 7554 1726853184.04997: done checking for max_fail_percentage 7554 1726853184.04998: checking to see if all hosts have failed and the running result is not ok 7554 1726853184.04999: done checking to see if all hosts have failed 7554 1726853184.05000: getting the remaining hosts for this loop 7554 1726853184.05001: done getting the remaining hosts for this loop 7554 1726853184.05004: getting the next task for host managed_node3 7554 1726853184.05010: done getting next task for host managed_node3 7554 1726853184.05014: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7554 1726853184.05017: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853184.05036: getting variables 7554 1726853184.05038: in VariableManager get_vars() 7554 1726853184.05183: Calling all_inventory to load vars for managed_node3 7554 1726853184.05186: Calling groups_inventory to load vars for managed_node3 7554 1726853184.05188: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853184.05197: Calling all_plugins_play to load vars for managed_node3 7554 1726853184.05200: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853184.05203: Calling groups_plugins_play to load vars for managed_node3 7554 1726853184.06586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853184.08542: done with get_vars() 7554 1726853184.08778: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:26:24 -0400 (0:00:00.057) 0:00:38.056 ****** 7554 1726853184.08868: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7554 1726853184.09522: worker is 1 (out of 1 available) 7554 1726853184.09535: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7554 1726853184.09546: done queuing things up, now waiting for results queue to drain 7554 1726853184.09548: waiting for pending results... 
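Each `get_vars()` pass above calls the variable sources in a fixed order, `all_inventory` through `groups_plugins_play`, with later sources overriding earlier ones. A simplified sketch of that layered merge (illustrative only; Ansible's real precedence chain has many more layers and recursive combine semantics):

```python
def get_vars(layers):
    """Merge variable layers lowest-to-highest precedence; later wins.
    Mirrors the all_inventory -> ... -> groups_plugins_play call order."""
    merged = {}
    for layer in layers:
        merged.update(layer)
    return merged

# Hypothetical layer contents, named after the sources in the log.
all_inventory       = {"network_provider": "initscripts", "interface": "veth0"}
groups_plugins_play = {"network_provider": "nm"}  # highest of these layers

hostvars = get_vars([all_inventory, groups_plugins_play])
```

The higher-precedence layer's `network_provider` value survives the merge, which is why the `set_fact` value decides the conditionals seen earlier.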
7554 1726853184.10190: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7554 1726853184.10422: in run() - task 02083763-bbaf-bdc3-98b6-0000000000c8 7554 1726853184.10427: variable 'ansible_search_path' from source: unknown 7554 1726853184.10429: variable 'ansible_search_path' from source: unknown 7554 1726853184.10432: calling self._execute() 7554 1726853184.10594: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853184.10647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853184.10662: variable 'omit' from source: magic vars 7554 1726853184.11457: variable 'ansible_distribution_major_version' from source: facts 7554 1726853184.11518: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853184.11533: variable 'omit' from source: magic vars 7554 1726853184.11719: variable 'omit' from source: magic vars 7554 1726853184.12002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853184.16689: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853184.16877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853184.16881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853184.16968: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853184.17003: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853184.17095: variable 'network_provider' from source: set_fact 7554 1726853184.17498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853184.18363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853184.18459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853184.18758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853184.18761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853184.18764: variable 'omit' from source: magic vars 7554 1726853184.18953: variable 'omit' from source: magic vars 7554 1726853184.19295: variable 'network_connections' from source: task vars 7554 1726853184.19313: variable 'interface' from source: play vars 7554 1726853184.19441: variable 'interface' from source: play vars 7554 1726853184.19758: variable 'omit' from source: magic vars 7554 1726853184.19761: variable '__lsr_ansible_managed' from source: task vars 7554 1726853184.19986: variable '__lsr_ansible_managed' from source: task vars 7554 1726853184.20281: Loaded config def from plugin (lookup/template) 7554 1726853184.20324: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7554 1726853184.20346: File lookup term: get_ansible_managed.j2 7554 1726853184.20376: variable 'ansible_search_path' from source: unknown 7554 1726853184.20380: evaluation_path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
7554 1726853184.20384: search_path:
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7554 1726853184.20405: variable 'ansible_search_path' from source: unknown 7554 1726853184.28144: variable 'ansible_managed' from source: unknown 7554 1726853184.28258: variable 'omit' from source: magic vars 7554 1726853184.28283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853184.28303: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853184.28323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853184.28333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853184.28348: Loading ShellModule 'sh' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853184.28366: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853184.28369: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853184.28373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853184.28435: Set connection var ansible_shell_executable to /bin/sh 7554 1726853184.28446: Set connection var ansible_pipelining to False 7554 1726853184.28449: Set connection var ansible_shell_type to sh 7554 1726853184.28452: Set connection var ansible_connection to ssh 7554 1726853184.28460: Set connection var ansible_timeout to 10 7554 1726853184.28465: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853184.28483: variable 'ansible_shell_executable' from source: unknown 7554 1726853184.28486: variable 'ansible_connection' from source: unknown 7554 1726853184.28488: variable 'ansible_module_compression' from source: unknown 7554 1726853184.28490: variable 'ansible_shell_type' from source: unknown 7554 1726853184.28492: variable 'ansible_shell_executable' from source: unknown 7554 1726853184.28494: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853184.28499: variable 'ansible_pipelining' from source: unknown 7554 1726853184.28502: variable 'ansible_timeout' from source: unknown 7554 1726853184.28506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853184.28603: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853184.28614: variable 'omit' from source: magic vars 7554 1726853184.28617: starting attempt loop 7554 1726853184.28621: running the handler 7554 
1726853184.28632: _low_level_execute_command(): starting 7554 1726853184.28638: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853184.29119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853184.29123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.29126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853184.29128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.29166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853184.29174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853184.29256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853184.30982: stdout chunk (state=3): >>>/root <<< 7554 1726853184.31091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853184.31148: stderr chunk (state=3): >>><<< 7554 1726853184.31176: stdout chunk (state=3): >>><<< 7554 1726853184.31184: _low_level_execute_command() done: rc=0, stdout=/root 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853184.31199: _low_level_execute_command(): starting 7554 1726853184.31273: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239 `" && echo ansible-tmp-1726853184.3118956-8971-255701149045239="` echo /root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239 `" ) && sleep 0' 7554 1726853184.31767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853184.31788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853184.31794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
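The `umask 77 && mkdir ...` command above is Ansible creating a per-task remote temp directory whose name encodes a timestamp, a PID, and a random suffix (`ansible-tmp-1726853184.3118956-8971-255701149045239`). A sketch of how such a command can be assembled (hypothetical helper; quoting is simplified relative to the nested-backquote `echo` form the log shows):

```python
import os
import random
import time

def remote_tmpdir_command(base="~/.ansible/tmp"):
    # Per-task temp dir name, matching the
    # ansible-tmp-<timestamp>-<pid>-<random> pattern in the log.
    name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2 ** 48))
    path = "%s/%s" % (base, name)
    # umask 77 makes the directory 0700 so only the remote user can
    # read the staged AnsiballZ module.
    cmd = "/bin/sh -c '( umask 77 && mkdir -p \"%s\" ) && sleep 0'" % path
    return path, cmd
```

The randomized name avoids collisions between concurrent tasks, and echoing the resulting path back (as the real command does with `echo ansible-tmp-...=...`) is how the controller learns where to upload `AnsiballZ_network_connections.py` in the SFTP transfer that follows.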
7554 1726853184.31815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.31829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.31870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853184.31885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853184.31952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853184.34078: stdout chunk (state=3): >>>ansible-tmp-1726853184.3118956-8971-255701149045239=/root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239 <<< 7554 1726853184.34082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853184.34149: stderr chunk (state=3): >>><<< 7554 1726853184.34153: stdout chunk (state=3): >>><<< 7554 1726853184.34198: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853184.3118956-8971-255701149045239=/root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853184.34383: variable 'ansible_module_compression' from source: unknown 7554 1726853184.34476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7554 1726853184.34526: variable 'ansible_facts' from source: unknown 7554 1726853184.34596: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239/AnsiballZ_network_connections.py 7554 1726853184.34702: Sending initial data 7554 1726853184.34706: Sent initial data (166 bytes) 7554 1726853184.35126: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853184.35129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.35135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853184.35137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853184.35139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.35177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853184.35188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853184.35251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853184.36885: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853184.36955: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 7554 1726853184.37009: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpnj7zl6mu /root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239/AnsiballZ_network_connections.py <<< 7554 1726853184.37012: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239/AnsiballZ_network_connections.py" <<< 7554 1726853184.37065: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpnj7zl6mu" to remote "/root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239/AnsiballZ_network_connections.py" <<< 7554 1726853184.38135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853184.38175: stderr chunk (state=3): >>><<< 7554 1726853184.38184: stdout chunk (state=3): >>><<< 7554 1726853184.38214: done transferring module to remote 7554 1726853184.38224: _low_level_execute_command(): starting 7554 1726853184.38228: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239/ /root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239/AnsiballZ_network_connections.py && sleep 0' 7554 1726853184.38628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853184.38660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853184.38664: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853184.38666: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.38668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853184.38670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853184.38676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.38721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853184.38724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853184.38796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853184.40700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853184.40703: stderr chunk (state=3): >>><<< 7554 1726853184.40706: stdout chunk (state=3): >>><<< 7554 1726853184.40744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853184.40748: _low_level_execute_command(): starting 7554 1726853184.40750: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239/AnsiballZ_network_connections.py && sleep 0' 7554 1726853184.41205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853184.41208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.41210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853184.41212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853184.41215: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.41261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853184.41274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853184.41336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853184.87723: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bae577e5-e110-4880-a74b-012e4e387e44\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bae577e5-e110-4880-a74b-012e4e387e44 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7554 1726853184.89826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853184.89830: stdout chunk (state=3): >>><<< 7554 1726853184.89833: stderr chunk (state=3): >>><<< 7554 1726853184.89857: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bae577e5-e110-4880-a74b-012e4e387e44\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bae577e5-e110-4880-a74b-012e4e387e44 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853184.89936: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'auto_gateway': False, 'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/64', '203.0.113.2/24'], 'gateway6': '2001:db8::1', 'gateway4': '203.0.113.1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853184.89940: _low_level_execute_command(): starting 7554 1726853184.89947: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853184.3118956-8971-255701149045239/ > /dev/null 2>&1 && sleep 0' 7554 1726853184.90696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853184.90724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853184.90744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853184.90766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853184.90861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853184.92819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853184.92846: stdout chunk (state=3): >>><<< 7554 1726853184.92850: stderr chunk (state=3): >>><<< 7554 1726853184.92980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853184.92988: handler run complete 7554 1726853184.92991: attempt loop complete, returning result 7554 1726853184.92993: _execute() done 7554 1726853184.92995: dumping result to json 7554 1726853184.92997: done dumping result, returning 7554 1726853184.92999: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-bdc3-98b6-0000000000c8] 7554 1726853184.93001: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c8 changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": false, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bae577e5-e110-4880-a74b-012e4e387e44 [004] #0, state:up persistent_state:present, 'veth0': 
up connection veth0, bae577e5-e110-4880-a74b-012e4e387e44 (not-active) 7554 1726853184.93416: no more pending results, returning what we have 7554 1726853184.93420: results queue empty 7554 1726853184.93421: checking for any_errors_fatal 7554 1726853184.93428: done checking for any_errors_fatal 7554 1726853184.93429: checking for max_fail_percentage 7554 1726853184.93430: done checking for max_fail_percentage 7554 1726853184.93431: checking to see if all hosts have failed and the running result is not ok 7554 1726853184.93432: done checking to see if all hosts have failed 7554 1726853184.93433: getting the remaining hosts for this loop 7554 1726853184.93435: done getting the remaining hosts for this loop 7554 1726853184.93438: getting the next task for host managed_node3 7554 1726853184.93446: done getting next task for host managed_node3 7554 1726853184.93451: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7554 1726853184.93454: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853184.93465: getting variables 7554 1726853184.93466: in VariableManager get_vars() 7554 1726853184.93519: Calling all_inventory to load vars for managed_node3 7554 1726853184.93522: Calling groups_inventory to load vars for managed_node3 7554 1726853184.93524: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853184.93531: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c8 7554 1726853184.93534: WORKER PROCESS EXITING 7554 1726853184.93545: Calling all_plugins_play to load vars for managed_node3 7554 1726853184.93548: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853184.93551: Calling groups_plugins_play to load vars for managed_node3 7554 1726853184.95379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853184.97008: done with get_vars() 7554 1726853184.97045: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:26:24 -0400 (0:00:00.882) 0:00:38.938 ****** 7554 1726853184.97141: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7554 1726853184.97618: worker is 1 (out of 1 available) 7554 1726853184.97631: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7554 1726853184.97646: done queuing things up, now waiting for results queue to drain 7554 1726853184.97648: waiting for pending results... 
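Editor's note on the invocation above: the `module_args` captured in the "Configure networking connection profiles" result map one-to-one onto the role's `network_connections` and `network_provider` variables. A minimal sketch of a play that would produce this exact invocation follows; only the values come from the log, while the play scaffolding (hosts line, role include form) is an assumption and not taken from the recorded run.

```yaml
# Sketch only — reconstructed from the module_args in the task result above.
- hosts: managed_node3            # host name as it appears in the log
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm      # logged as provider: "nm"
        network_connections:
          - name: veth0
            type: ethernet
            state: up
            ip:
              dhcp4: false
              auto6: false
              auto_gateway: false
              address:
                - 2001:db8::2/64
                - 203.0.113.2/24
              gateway4: 203.0.113.1
              gateway6: 2001:db8::1
```

With these variables the role renders the `#\n# Ansible managed\n# system_role:network` header and invokes `fedora.linux_system_roles.network_connections` with the connection list shown in the result, which is why the run reports `changed: true` for veth0.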
7554 1726853184.97886: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7554 1726853184.98035: in run() - task 02083763-bbaf-bdc3-98b6-0000000000c9 7554 1726853184.98062: variable 'ansible_search_path' from source: unknown 7554 1726853184.98069: variable 'ansible_search_path' from source: unknown 7554 1726853184.98109: calling self._execute() 7554 1726853184.98212: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853184.98223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853184.98235: variable 'omit' from source: magic vars 7554 1726853184.98640: variable 'ansible_distribution_major_version' from source: facts 7554 1726853184.98662: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853184.98792: variable 'network_state' from source: role '' defaults 7554 1726853184.98816: Evaluated conditional (network_state != {}): False 7554 1726853184.98824: when evaluation is False, skipping this task 7554 1726853184.98830: _execute() done 7554 1726853184.98837: dumping result to json 7554 1726853184.98847: done dumping result, returning 7554 1726853184.98912: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-bdc3-98b6-0000000000c9] 7554 1726853184.98916: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c9 7554 1726853184.98998: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000c9 7554 1726853184.99001: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853184.99081: no more pending results, returning what we have 7554 1726853184.99086: results queue empty 7554 1726853184.99087: checking for any_errors_fatal 7554 1726853184.99099: done checking for any_errors_fatal 7554 1726853184.99100: 
checking for max_fail_percentage 7554 1726853184.99102: done checking for max_fail_percentage 7554 1726853184.99103: checking to see if all hosts have failed and the running result is not ok 7554 1726853184.99104: done checking to see if all hosts have failed 7554 1726853184.99105: getting the remaining hosts for this loop 7554 1726853184.99107: done getting the remaining hosts for this loop 7554 1726853184.99111: getting the next task for host managed_node3 7554 1726853184.99118: done getting next task for host managed_node3 7554 1726853184.99237: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7554 1726853184.99241: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853184.99270: getting variables 7554 1726853184.99475: in VariableManager get_vars() 7554 1726853184.99528: Calling all_inventory to load vars for managed_node3 7554 1726853184.99531: Calling groups_inventory to load vars for managed_node3 7554 1726853184.99534: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853184.99549: Calling all_plugins_play to load vars for managed_node3 7554 1726853184.99553: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853184.99556: Calling groups_plugins_play to load vars for managed_node3 7554 1726853185.02394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853185.04104: done with get_vars() 7554 1726853185.04145: done getting variables 7554 1726853185.04214: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:26:25 -0400 (0:00:00.071) 0:00:39.010 ****** 7554 1726853185.04261: entering _queue_task() for managed_node3/debug 7554 1726853185.04646: worker is 1 (out of 1 available) 7554 1726853185.04661: exiting _queue_task() for managed_node3/debug 7554 1726853185.04796: done queuing things up, now waiting for results queue to drain 7554 1726853185.04798: waiting for pending results... 
7554 1726853185.05092: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7554 1726853185.05231: in run() - task 02083763-bbaf-bdc3-98b6-0000000000ca 7554 1726853185.05236: variable 'ansible_search_path' from source: unknown 7554 1726853185.05238: variable 'ansible_search_path' from source: unknown 7554 1726853185.05250: calling self._execute() 7554 1726853185.05363: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.05380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.05399: variable 'omit' from source: magic vars 7554 1726853185.05815: variable 'ansible_distribution_major_version' from source: facts 7554 1726853185.05840: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853185.05885: variable 'omit' from source: magic vars 7554 1726853185.05914: variable 'omit' from source: magic vars 7554 1726853185.05960: variable 'omit' from source: magic vars 7554 1726853185.06013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853185.06061: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853185.06092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853185.06169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853185.06174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853185.06184: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853185.06193: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.06202: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7554 1726853185.06329: Set connection var ansible_shell_executable to /bin/sh 7554 1726853185.06349: Set connection var ansible_pipelining to False 7554 1726853185.06358: Set connection var ansible_shell_type to sh 7554 1726853185.06366: Set connection var ansible_connection to ssh 7554 1726853185.06429: Set connection var ansible_timeout to 10 7554 1726853185.06432: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853185.06435: variable 'ansible_shell_executable' from source: unknown 7554 1726853185.06437: variable 'ansible_connection' from source: unknown 7554 1726853185.06439: variable 'ansible_module_compression' from source: unknown 7554 1726853185.06441: variable 'ansible_shell_type' from source: unknown 7554 1726853185.06445: variable 'ansible_shell_executable' from source: unknown 7554 1726853185.06447: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.06455: variable 'ansible_pipelining' from source: unknown 7554 1726853185.06461: variable 'ansible_timeout' from source: unknown 7554 1726853185.06468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.06625: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853185.06677: variable 'omit' from source: magic vars 7554 1726853185.06681: starting attempt loop 7554 1726853185.06684: running the handler 7554 1726853185.06823: variable '__network_connections_result' from source: set_fact 7554 1726853185.06891: handler run complete 7554 1726853185.06912: attempt loop complete, returning result 7554 1726853185.06929: _execute() done 7554 1726853185.06932: dumping result to json 7554 1726853185.06935: done dumping result, returning 7554 
1726853185.06979: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-bdc3-98b6-0000000000ca] 7554 1726853185.06982: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000ca ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bae577e5-e110-4880-a74b-012e4e387e44", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bae577e5-e110-4880-a74b-012e4e387e44 (not-active)" ] } 7554 1726853185.07265: no more pending results, returning what we have 7554 1726853185.07268: results queue empty 7554 1726853185.07269: checking for any_errors_fatal 7554 1726853185.07280: done checking for any_errors_fatal 7554 1726853185.07281: checking for max_fail_percentage 7554 1726853185.07282: done checking for max_fail_percentage 7554 1726853185.07283: checking to see if all hosts have failed and the running result is not ok 7554 1726853185.07284: done checking to see if all hosts have failed 7554 1726853185.07285: getting the remaining hosts for this loop 7554 1726853185.07287: done getting the remaining hosts for this loop 7554 1726853185.07291: getting the next task for host managed_node3 7554 1726853185.07303: done getting next task for host managed_node3 7554 1726853185.07307: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7554 1726853185.07311: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853185.07383: getting variables 7554 1726853185.07386: in VariableManager get_vars() 7554 1726853185.07448: Calling all_inventory to load vars for managed_node3 7554 1726853185.07451: Calling groups_inventory to load vars for managed_node3 7554 1726853185.07454: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853185.07525: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000ca 7554 1726853185.07529: WORKER PROCESS EXITING 7554 1726853185.07539: Calling all_plugins_play to load vars for managed_node3 7554 1726853185.07545: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853185.07550: Calling groups_plugins_play to load vars for managed_node3 7554 1726853185.09243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853185.10744: done with get_vars() 7554 1726853185.10779: done getting variables 7554 1726853185.10841: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:26:25 -0400 (0:00:00.066) 0:00:39.076 ****** 7554 1726853185.10878: entering _queue_task() for managed_node3/debug 7554 1726853185.11232: worker is 1 (out of 1 available) 7554 1726853185.11244: exiting _queue_task() for managed_node3/debug 7554 1726853185.11257: done queuing things up, now waiting for results queue to drain 7554 1726853185.11258: waiting for 
pending results... 7554 1726853185.11553: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7554 1726853185.11669: in run() - task 02083763-bbaf-bdc3-98b6-0000000000cb 7554 1726853185.11691: variable 'ansible_search_path' from source: unknown 7554 1726853185.11696: variable 'ansible_search_path' from source: unknown 7554 1726853185.11730: calling self._execute() 7554 1726853185.11825: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.11831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.11912: variable 'omit' from source: magic vars 7554 1726853185.12227: variable 'ansible_distribution_major_version' from source: facts 7554 1726853185.12238: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853185.12249: variable 'omit' from source: magic vars 7554 1726853185.12317: variable 'omit' from source: magic vars 7554 1726853185.12356: variable 'omit' from source: magic vars 7554 1726853185.12400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853185.12435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853185.12458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853185.12481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853185.12493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853185.12524: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853185.12527: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.12530: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7554 1726853185.12686: Set connection var ansible_shell_executable to /bin/sh 7554 1726853185.12689: Set connection var ansible_pipelining to False 7554 1726853185.12691: Set connection var ansible_shell_type to sh 7554 1726853185.12694: Set connection var ansible_connection to ssh 7554 1726853185.12696: Set connection var ansible_timeout to 10 7554 1726853185.12698: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853185.12700: variable 'ansible_shell_executable' from source: unknown 7554 1726853185.12702: variable 'ansible_connection' from source: unknown 7554 1726853185.12705: variable 'ansible_module_compression' from source: unknown 7554 1726853185.12707: variable 'ansible_shell_type' from source: unknown 7554 1726853185.12708: variable 'ansible_shell_executable' from source: unknown 7554 1726853185.12710: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.12712: variable 'ansible_pipelining' from source: unknown 7554 1726853185.12714: variable 'ansible_timeout' from source: unknown 7554 1726853185.12716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.12849: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853185.12861: variable 'omit' from source: magic vars 7554 1726853185.12866: starting attempt loop 7554 1726853185.12872: running the handler 7554 1726853185.12940: variable '__network_connections_result' from source: set_fact 7554 1726853185.13176: variable '__network_connections_result' from source: set_fact 7554 1726853185.13179: handler run complete 7554 1726853185.13182: attempt loop complete, returning result 7554 1726853185.13184: _execute() done 7554 
1726853185.13187: dumping result to json 7554 1726853185.13189: done dumping result, returning 7554 1726853185.13191: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-bdc3-98b6-0000000000cb] 7554 1726853185.13193: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000cb 7554 1726853185.13294: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000cb 7554 1726853185.13297: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": false, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bae577e5-e110-4880-a74b-012e4e387e44\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bae577e5-e110-4880-a74b-012e4e387e44 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bae577e5-e110-4880-a74b-012e4e387e44", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bae577e5-e110-4880-a74b-012e4e387e44 (not-active)" ] } } 7554 1726853185.13407: no more pending results, returning what we have 7554 1726853185.13411: results queue empty 7554 1726853185.13412: checking for any_errors_fatal 7554 1726853185.13422: done checking for any_errors_fatal 7554 1726853185.13423: checking for max_fail_percentage 7554 1726853185.13424: done checking for max_fail_percentage 7554 1726853185.13426: checking to see if all hosts have failed and the running 
result is not ok 7554 1726853185.13427: done checking to see if all hosts have failed 7554 1726853185.13428: getting the remaining hosts for this loop 7554 1726853185.13429: done getting the remaining hosts for this loop 7554 1726853185.13433: getting the next task for host managed_node3 7554 1726853185.13440: done getting next task for host managed_node3 7554 1726853185.13443: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7554 1726853185.13447: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853185.13460: getting variables 7554 1726853185.13462: in VariableManager get_vars() 7554 1726853185.13519: Calling all_inventory to load vars for managed_node3 7554 1726853185.13522: Calling groups_inventory to load vars for managed_node3 7554 1726853185.13525: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853185.13536: Calling all_plugins_play to load vars for managed_node3 7554 1726853185.13539: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853185.13542: Calling groups_plugins_play to load vars for managed_node3 7554 1726853185.15093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853185.16782: done with get_vars() 7554 1726853185.16804: done getting variables 7554 1726853185.16862: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:26:25 -0400 (0:00:00.060) 0:00:39.136 ****** 7554 1726853185.16898: entering _queue_task() for managed_node3/debug 7554 1726853185.17230: worker is 1 (out of 1 available) 7554 1726853185.17242: exiting _queue_task() for managed_node3/debug 7554 1726853185.17253: done queuing things up, now waiting for results queue to drain 7554 1726853185.17255: waiting for pending results... 
7554 1726853185.17634: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7554 1726853185.17686: in run() - task 02083763-bbaf-bdc3-98b6-0000000000cc 7554 1726853185.17705: variable 'ansible_search_path' from source: unknown 7554 1726853185.17708: variable 'ansible_search_path' from source: unknown 7554 1726853185.17741: calling self._execute() 7554 1726853185.17838: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.17847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.17854: variable 'omit' from source: magic vars 7554 1726853185.18233: variable 'ansible_distribution_major_version' from source: facts 7554 1726853185.18249: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853185.18356: variable 'network_state' from source: role '' defaults 7554 1726853185.18367: Evaluated conditional (network_state != {}): False 7554 1726853185.18370: when evaluation is False, skipping this task 7554 1726853185.18375: _execute() done 7554 1726853185.18382: dumping result to json 7554 1726853185.18386: done dumping result, returning 7554 1726853185.18389: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-bdc3-98b6-0000000000cc] 7554 1726853185.18493: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000cc 7554 1726853185.18556: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000cc 7554 1726853185.18559: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7554 1726853185.18616: no more pending results, returning what we have 7554 1726853185.18621: results queue empty 7554 1726853185.18622: checking for any_errors_fatal 7554 1726853185.18635: done checking for any_errors_fatal 7554 1726853185.18636: checking for max_fail_percentage 7554 
1726853185.18638: done checking for max_fail_percentage 7554 1726853185.18639: checking to see if all hosts have failed and the running result is not ok 7554 1726853185.18640: done checking to see if all hosts have failed 7554 1726853185.18641: getting the remaining hosts for this loop 7554 1726853185.18643: done getting the remaining hosts for this loop 7554 1726853185.18647: getting the next task for host managed_node3 7554 1726853185.18654: done getting next task for host managed_node3 7554 1726853185.18658: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7554 1726853185.18662: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853185.18691: getting variables 7554 1726853185.18693: in VariableManager get_vars() 7554 1726853185.18747: Calling all_inventory to load vars for managed_node3 7554 1726853185.18750: Calling groups_inventory to load vars for managed_node3 7554 1726853185.18752: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853185.18767: Calling all_plugins_play to load vars for managed_node3 7554 1726853185.18957: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853185.18964: Calling groups_plugins_play to load vars for managed_node3 7554 1726853185.20249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853185.21741: done with get_vars() 7554 1726853185.21764: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:26:25 -0400 (0:00:00.049) 0:00:39.186 ****** 7554 1726853185.21855: entering _queue_task() for managed_node3/ping 7554 1726853185.22290: worker is 1 (out of 1 available) 7554 1726853185.22301: exiting _queue_task() for managed_node3/ping 7554 1726853185.22310: done queuing things up, now waiting for results queue to drain 7554 1726853185.22312: waiting for pending results... 
7554 1726853185.22562: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7554 1726853185.22721: in run() - task 02083763-bbaf-bdc3-98b6-0000000000cd 7554 1726853185.22725: variable 'ansible_search_path' from source: unknown 7554 1726853185.22728: variable 'ansible_search_path' from source: unknown 7554 1726853185.22731: calling self._execute() 7554 1726853185.22809: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.22817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.22822: variable 'omit' from source: magic vars 7554 1726853185.23260: variable 'ansible_distribution_major_version' from source: facts 7554 1726853185.23265: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853185.23268: variable 'omit' from source: magic vars 7554 1726853185.23272: variable 'omit' from source: magic vars 7554 1726853185.23312: variable 'omit' from source: magic vars 7554 1726853185.23355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853185.23392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853185.23477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853185.23481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853185.23483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853185.23485: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853185.23487: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.23491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 
1726853185.23591: Set connection var ansible_shell_executable to /bin/sh 7554 1726853185.23594: Set connection var ansible_pipelining to False 7554 1726853185.23597: Set connection var ansible_shell_type to sh 7554 1726853185.23603: Set connection var ansible_connection to ssh 7554 1726853185.23609: Set connection var ansible_timeout to 10 7554 1726853185.23614: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853185.23646: variable 'ansible_shell_executable' from source: unknown 7554 1726853185.23649: variable 'ansible_connection' from source: unknown 7554 1726853185.23652: variable 'ansible_module_compression' from source: unknown 7554 1726853185.23654: variable 'ansible_shell_type' from source: unknown 7554 1726853185.23656: variable 'ansible_shell_executable' from source: unknown 7554 1726853185.23658: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.23661: variable 'ansible_pipelining' from source: unknown 7554 1726853185.23663: variable 'ansible_timeout' from source: unknown 7554 1726853185.23695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.23870: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853185.23912: variable 'omit' from source: magic vars 7554 1726853185.23915: starting attempt loop 7554 1726853185.23918: running the handler 7554 1726853185.23920: _low_level_execute_command(): starting 7554 1726853185.23922: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853185.24678: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.24687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853185.24761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853185.24811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.24884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853185.26575: stdout chunk (state=3): >>>/root <<< 7554 1726853185.26689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853185.26708: stderr chunk (state=3): >>><<< 7554 1726853185.26711: stdout chunk (state=3): >>><<< 7554 1726853185.26733: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853185.26747: _low_level_execute_command(): starting 7554 1726853185.26751: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428 `" && echo ansible-tmp-1726853185.2673216-9009-188795829289428="` echo /root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428 `" ) && sleep 0' 7554 1726853185.27223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853185.27226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853185.27228: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.27230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
7554 1726853185.27240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853185.27243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.27320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853185.27324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.27401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853185.29379: stdout chunk (state=3): >>>ansible-tmp-1726853185.2673216-9009-188795829289428=/root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428 <<< 7554 1726853185.29482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853185.29510: stderr chunk (state=3): >>><<< 7554 1726853185.29513: stdout chunk (state=3): >>><<< 7554 1726853185.29531: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853185.2673216-9009-188795829289428=/root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853185.29575: variable 'ansible_module_compression' from source: unknown 7554 1726853185.29607: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7554 1726853185.29638: variable 'ansible_facts' from source: unknown 7554 1726853185.29694: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428/AnsiballZ_ping.py 7554 1726853185.29797: Sending initial data 7554 1726853185.29800: Sent initial data (151 bytes) 7554 1726853185.30409: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853185.30448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.30536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853185.32212: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853185.32272: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853185.32324: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpwhvszlgq /root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428/AnsiballZ_ping.py <<< 7554 1726853185.32332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428/AnsiballZ_ping.py" <<< 7554 1726853185.32385: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpwhvszlgq" to remote "/root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428/AnsiballZ_ping.py" <<< 7554 1726853185.32981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853185.33036: stderr chunk (state=3): >>><<< 7554 1726853185.33039: stdout chunk (state=3): >>><<< 7554 1726853185.33083: done transferring module to remote 7554 1726853185.33096: _low_level_execute_command(): starting 7554 1726853185.33099: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428/ /root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428/AnsiballZ_ping.py && sleep 0' 7554 1726853185.33726: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853185.33730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853185.33733: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853185.33736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.33809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853185.33817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853185.33820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.33877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853185.35769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853185.35794: stderr chunk (state=3): >>><<< 7554 1726853185.35797: stdout chunk (state=3): >>><<< 7554 1726853185.35811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853185.35814: _low_level_execute_command(): starting 7554 1726853185.35820: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428/AnsiballZ_ping.py && sleep 0' 7554 1726853185.36249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853185.36252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853185.36255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853185.36257: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853185.36259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.36312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853185.36320: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.36386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853185.51848: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7554 1726853185.53228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853185.53258: stderr chunk (state=3): >>><<< 7554 1726853185.53261: stdout chunk (state=3): >>><<< 7554 1726853185.53278: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853185.53300: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853185.53306: _low_level_execute_command(): starting 7554 1726853185.53311: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853185.2673216-9009-188795829289428/ > /dev/null 2>&1 && sleep 0' 7554 1726853185.53768: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853185.53803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853185.53808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853185.53810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.53812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853185.53814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853185.53816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.53872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853185.53879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853185.53881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.53936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853185.55817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853185.55840: stderr chunk (state=3): >>><<< 7554 1726853185.55843: stdout chunk (state=3): >>><<< 7554 1726853185.55860: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853185.55869: handler run complete 7554 1726853185.55881: attempt loop complete, returning result 7554 1726853185.55884: _execute() done 7554 1726853185.55886: dumping result to json 7554 1726853185.55889: done dumping result, returning 7554 1726853185.55897: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-bdc3-98b6-0000000000cd] 7554 1726853185.55904: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000cd 7554 1726853185.55990: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000cd 7554 1726853185.55992: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7554 1726853185.56075: no more pending results, returning what we have 7554 1726853185.56078: results queue empty 7554 1726853185.56079: checking for any_errors_fatal 7554 1726853185.56087: done checking for any_errors_fatal 7554 1726853185.56087: checking for max_fail_percentage 7554 1726853185.56089: done checking for max_fail_percentage 7554 1726853185.56090: checking to see if all hosts have failed and the running result is not ok 7554 1726853185.56091: done checking to see if all hosts have failed 7554 1726853185.56091: getting the remaining hosts for this loop 7554 1726853185.56093: done getting the remaining hosts for this loop 7554 1726853185.56096: getting the next task for host managed_node3 7554 1726853185.56105: done getting next task for host managed_node3 7554 1726853185.56107: ^ task is: TASK: meta (role_complete) 7554 1726853185.56110: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853185.56124: getting variables 7554 1726853185.56126: in VariableManager get_vars() 7554 1726853185.56175: Calling all_inventory to load vars for managed_node3 7554 1726853185.56178: Calling groups_inventory to load vars for managed_node3 7554 1726853185.56180: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853185.56189: Calling all_plugins_play to load vars for managed_node3 7554 1726853185.56192: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853185.56194: Calling groups_plugins_play to load vars for managed_node3 7554 1726853185.57105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853185.58516: done with get_vars() 7554 1726853185.58541: done getting variables 7554 1726853185.58625: done queuing things up, now waiting for results queue to drain 7554 1726853185.58627: results queue empty 7554 1726853185.58628: checking for any_errors_fatal 7554 1726853185.58631: done checking for any_errors_fatal 7554 1726853185.58631: checking for max_fail_percentage 7554 1726853185.58633: done checking for max_fail_percentage 7554 1726853185.58633: checking to see if all hosts have failed and the running result is not ok 7554 1726853185.58634: done checking to see if all hosts have failed 7554 1726853185.58635: getting the remaining hosts for this loop 7554 1726853185.58636: done getting the remaining hosts for this loop 7554 1726853185.58639: getting the next task for host managed_node3 7554 1726853185.58642: done getting next task for host managed_node3 7554 
1726853185.58644: ^ task is: TASK: Include the task 'assert_device_present.yml' 7554 1726853185.58646: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853185.58648: getting variables 7554 1726853185.58649: in VariableManager get_vars() 7554 1726853185.58667: Calling all_inventory to load vars for managed_node3 7554 1726853185.58669: Calling groups_inventory to load vars for managed_node3 7554 1726853185.58675: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853185.58680: Calling all_plugins_play to load vars for managed_node3 7554 1726853185.58682: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853185.58685: Calling groups_plugins_play to load vars for managed_node3 7554 1726853185.59778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853185.61311: done with get_vars() 7554 1726853185.61333: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:108 Friday 20 September 2024 13:26:25 -0400 (0:00:00.395) 0:00:39.581 ****** 7554 1726853185.61407: entering _queue_task() for managed_node3/include_tasks 7554 1726853185.61745: worker is 1 (out of 1 available) 7554 1726853185.61757: exiting _queue_task() for managed_node3/include_tasks 7554 1726853185.61769: done queuing things up, now waiting for results queue to drain 7554 1726853185.61975: waiting for pending results... 
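Every task in this run passes through the same fact check, logged each time as `Evaluated conditional (ansible_distribution_major_version != '6'): True`. Ansible facts arrive as strings, so this is a string comparison, not a numeric one. A minimal sketch of that gate, assuming only what the log shows (the function name is an illustrative choice, not anything from the role):

```python
def distro_gate(facts):
    """Mirror the playbook's `when:` clause seen in the log: run
    everywhere except major version '6'.

    ansible_distribution_major_version is a string fact, so the
    test is string inequality ('9' != '6' is True, and so is '10').
    """
    return facts.get("ansible_distribution_major_version") != "6"
```

A host with no such fact at all would also evaluate True here, since `dict.get` returns `None`, which is not equal to `'6'`.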
7554 1726853185.62105: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7554 1726853185.62166: in run() - task 02083763-bbaf-bdc3-98b6-0000000000fd 7554 1726853185.62190: variable 'ansible_search_path' from source: unknown 7554 1726853185.62236: calling self._execute() 7554 1726853185.62339: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.62352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.62417: variable 'omit' from source: magic vars 7554 1726853185.62764: variable 'ansible_distribution_major_version' from source: facts 7554 1726853185.62783: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853185.62793: _execute() done 7554 1726853185.62801: dumping result to json 7554 1726853185.62808: done dumping result, returning 7554 1726853185.62817: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [02083763-bbaf-bdc3-98b6-0000000000fd] 7554 1726853185.62826: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000fd 7554 1726853185.62982: no more pending results, returning what we have 7554 1726853185.62987: in VariableManager get_vars() 7554 1726853185.63047: Calling all_inventory to load vars for managed_node3 7554 1726853185.63049: Calling groups_inventory to load vars for managed_node3 7554 1726853185.63052: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853185.63066: Calling all_plugins_play to load vars for managed_node3 7554 1726853185.63069: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853185.63074: Calling groups_plugins_play to load vars for managed_node3 7554 1726853185.63784: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000fd 7554 1726853185.63788: WORKER PROCESS EXITING 7554 1726853185.64741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7554 1726853185.66218: done with get_vars() 7554 1726853185.66243: variable 'ansible_search_path' from source: unknown 7554 1726853185.66259: we have included files to process 7554 1726853185.66260: generating all_blocks data 7554 1726853185.66263: done generating all_blocks data 7554 1726853185.66268: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7554 1726853185.66270: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7554 1726853185.66275: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7554 1726853185.66384: in VariableManager get_vars() 7554 1726853185.66414: done with get_vars() 7554 1726853185.66524: done processing included file 7554 1726853185.66526: iterating over new_blocks loaded from include file 7554 1726853185.66528: in VariableManager get_vars() 7554 1726853185.66549: done with get_vars() 7554 1726853185.66551: filtering new block on tags 7554 1726853185.66569: done filtering new block on tags 7554 1726853185.66573: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7554 1726853185.66579: extending task lists for all hosts with included blocks 7554 1726853185.71678: done extending task lists 7554 1726853185.71680: done processing included files 7554 1726853185.71681: results queue empty 7554 1726853185.71681: checking for any_errors_fatal 7554 1726853185.71683: done checking for any_errors_fatal 7554 1726853185.71683: checking for max_fail_percentage 7554 1726853185.71685: done checking for max_fail_percentage 7554 1726853185.71685: checking to see if all hosts have failed and the running 
result is not ok 7554 1726853185.71687: done checking to see if all hosts have failed 7554 1726853185.71687: getting the remaining hosts for this loop 7554 1726853185.71688: done getting the remaining hosts for this loop 7554 1726853185.71691: getting the next task for host managed_node3 7554 1726853185.71695: done getting next task for host managed_node3 7554 1726853185.71697: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7554 1726853185.71700: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853185.71702: getting variables 7554 1726853185.71703: in VariableManager get_vars() 7554 1726853185.71725: Calling all_inventory to load vars for managed_node3 7554 1726853185.71728: Calling groups_inventory to load vars for managed_node3 7554 1726853185.71730: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853185.71736: Calling all_plugins_play to load vars for managed_node3 7554 1726853185.71738: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853185.71741: Calling groups_plugins_play to load vars for managed_node3 7554 1726853185.78051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853185.79505: done with get_vars() 7554 1726853185.79532: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:26:25 -0400 (0:00:00.181) 0:00:39.763 ****** 7554 1726853185.79605: entering _queue_task() for managed_node3/include_tasks 7554 1726853185.79958: worker is 1 (out of 1 available) 7554 1726853185.79974: exiting _queue_task() for managed_node3/include_tasks 7554 1726853185.79986: done queuing things up, now waiting for results queue to drain 7554 1726853185.79988: waiting for pending results... 
7554 1726853185.80203: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7554 1726853185.80290: in run() - task 02083763-bbaf-bdc3-98b6-00000000143a 7554 1726853185.80301: variable 'ansible_search_path' from source: unknown 7554 1726853185.80305: variable 'ansible_search_path' from source: unknown 7554 1726853185.80334: calling self._execute() 7554 1726853185.80408: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.80412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.80422: variable 'omit' from source: magic vars 7554 1726853185.80720: variable 'ansible_distribution_major_version' from source: facts 7554 1726853185.80730: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853185.80735: _execute() done 7554 1726853185.80738: dumping result to json 7554 1726853185.80740: done dumping result, returning 7554 1726853185.80747: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-bdc3-98b6-00000000143a] 7554 1726853185.80754: sending task result for task 02083763-bbaf-bdc3-98b6-00000000143a 7554 1726853185.80839: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000143a 7554 1726853185.80844: WORKER PROCESS EXITING 7554 1726853185.80889: no more pending results, returning what we have 7554 1726853185.80894: in VariableManager get_vars() 7554 1726853185.80955: Calling all_inventory to load vars for managed_node3 7554 1726853185.80958: Calling groups_inventory to load vars for managed_node3 7554 1726853185.80960: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853185.80973: Calling all_plugins_play to load vars for managed_node3 7554 1726853185.80976: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853185.80979: Calling groups_plugins_play to load vars for managed_node3 7554 1726853185.81752: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853185.83086: done with get_vars() 7554 1726853185.83101: variable 'ansible_search_path' from source: unknown 7554 1726853185.83102: variable 'ansible_search_path' from source: unknown 7554 1726853185.83127: we have included files to process 7554 1726853185.83128: generating all_blocks data 7554 1726853185.83129: done generating all_blocks data 7554 1726853185.83130: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7554 1726853185.83131: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7554 1726853185.83132: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7554 1726853185.83264: done processing included file 7554 1726853185.83266: iterating over new_blocks loaded from include file 7554 1726853185.83267: in VariableManager get_vars() 7554 1726853185.83287: done with get_vars() 7554 1726853185.83288: filtering new block on tags 7554 1726853185.83298: done filtering new block on tags 7554 1726853185.83300: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7554 1726853185.83304: extending task lists for all hosts with included blocks 7554 1726853185.83364: done extending task lists 7554 1726853185.83365: done processing included files 7554 1726853185.83366: results queue empty 7554 1726853185.83367: checking for any_errors_fatal 7554 1726853185.83370: done checking for any_errors_fatal 7554 1726853185.83370: checking for max_fail_percentage 7554 1726853185.83372: done checking for max_fail_percentage 7554 1726853185.83373: 
checking to see if all hosts have failed and the running result is not ok 7554 1726853185.83374: done checking to see if all hosts have failed 7554 1726853185.83374: getting the remaining hosts for this loop 7554 1726853185.83375: done getting the remaining hosts for this loop 7554 1726853185.83376: getting the next task for host managed_node3 7554 1726853185.83379: done getting next task for host managed_node3 7554 1726853185.83381: ^ task is: TASK: Get stat for interface {{ interface }} 7554 1726853185.83384: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853185.83386: getting variables 7554 1726853185.83386: in VariableManager get_vars() 7554 1726853185.83397: Calling all_inventory to load vars for managed_node3 7554 1726853185.83399: Calling groups_inventory to load vars for managed_node3 7554 1726853185.83400: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853185.83404: Calling all_plugins_play to load vars for managed_node3 7554 1726853185.83405: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853185.83407: Calling groups_plugins_play to load vars for managed_node3 7554 1726853185.84089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853185.84930: done with get_vars() 7554 1726853185.84947: done getting variables 7554 1726853185.85066: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:26:25 -0400 (0:00:00.054) 0:00:39.818 ****** 7554 1726853185.85092: entering _queue_task() for managed_node3/stat 7554 1726853185.85426: worker is 1 (out of 1 available) 7554 1726853185.85440: exiting _queue_task() for managed_node3/stat 7554 1726853185.85451: done queuing things up, now waiting for results queue to drain 7554 1726853185.85453: waiting for pending results... 
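The `_low_level_execute_command()` calls that follow stage the stat module on the remote host as a self-contained AnsiballZ payload, placed in a throwaway directory named `ansible-tmp-<epoch>-<id>-<random>` under `~/.ansible/tmp` and created under `umask 77` so it is private to the connecting user. A hedged Python sketch of that naming-and-creation scheme, reconstructed from the shell command visible later in the log (the helper name, the meaning of the middle id, and the random-suffix width are illustrative assumptions, not Ansible internals):

```python
import os
import random
import time


def make_module_tmpdir(base="~/.ansible/tmp"):
    """Create a private staging dir shaped like the one in the log:
    ansible-tmp-<epoch>-<id>-<random>."""
    base = os.path.expanduser(base)
    name = "ansible-tmp-%s-%s-%s" % (
        time.time(),                  # epoch with sub-second precision
        os.getpid(),                  # a per-process id (assumption)
        random.randint(0, 2 ** 47),   # collision-avoidance suffix
    )
    path = os.path.join(base, name)
    prev = os.umask(0o077)            # mirrors the `umask 77` in the log
    try:
        os.makedirs(base, exist_ok=True)
        os.mkdir(path)                # ends up 0700: owner-only access
    finally:
        os.umask(prev)
    return path
```

The `umask 77` detail matters: module payloads can embed secrets, so the staging directory must not be readable by other users on the managed node.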
7554 1726853185.85889: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7554 1726853185.85895: in run() - task 02083763-bbaf-bdc3-98b6-0000000016ba 7554 1726853185.85912: variable 'ansible_search_path' from source: unknown 7554 1726853185.85923: variable 'ansible_search_path' from source: unknown 7554 1726853185.86078: calling self._execute() 7554 1726853185.86082: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.86088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.86091: variable 'omit' from source: magic vars 7554 1726853185.86506: variable 'ansible_distribution_major_version' from source: facts 7554 1726853185.86520: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853185.86525: variable 'omit' from source: magic vars 7554 1726853185.86592: variable 'omit' from source: magic vars 7554 1726853185.86724: variable 'interface' from source: play vars 7554 1726853185.86738: variable 'omit' from source: magic vars 7554 1726853185.86773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853185.86807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853185.86841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853185.86847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853185.86864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853185.86897: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853185.86901: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.86903: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7554 1726853185.86972: Set connection var ansible_shell_executable to /bin/sh 7554 1726853185.86979: Set connection var ansible_pipelining to False 7554 1726853185.86983: Set connection var ansible_shell_type to sh 7554 1726853185.86986: Set connection var ansible_connection to ssh 7554 1726853185.86993: Set connection var ansible_timeout to 10 7554 1726853185.86999: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853185.87022: variable 'ansible_shell_executable' from source: unknown 7554 1726853185.87026: variable 'ansible_connection' from source: unknown 7554 1726853185.87029: variable 'ansible_module_compression' from source: unknown 7554 1726853185.87031: variable 'ansible_shell_type' from source: unknown 7554 1726853185.87033: variable 'ansible_shell_executable' from source: unknown 7554 1726853185.87036: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853185.87038: variable 'ansible_pipelining' from source: unknown 7554 1726853185.87040: variable 'ansible_timeout' from source: unknown 7554 1726853185.87045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853185.87195: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853185.87204: variable 'omit' from source: magic vars 7554 1726853185.87209: starting attempt loop 7554 1726853185.87211: running the handler 7554 1726853185.87227: _low_level_execute_command(): starting 7554 1726853185.87234: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853185.87738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 
1726853185.87745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.87749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853185.87753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.87802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853185.87806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853185.87809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.87882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853185.89605: stdout chunk (state=3): >>>/root <<< 7554 1726853185.89753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853185.89756: stdout chunk (state=3): >>><<< 7554 1726853185.89759: stderr chunk (state=3): >>><<< 7554 1726853185.89783: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853185.89804: _low_level_execute_command(): starting 7554 1726853185.89877: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975 `" && echo ansible-tmp-1726853185.8979096-9030-135675202134975="` echo /root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975 `" ) && sleep 0' 7554 1726853185.90469: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853185.90493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853185.90536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.90578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853185.90590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.90659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853185.92654: stdout chunk (state=3): >>>ansible-tmp-1726853185.8979096-9030-135675202134975=/root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975 <<< 7554 1726853185.92792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853185.92812: stderr chunk (state=3): >>><<< 7554 1726853185.92825: stdout chunk (state=3): >>><<< 7554 1726853185.93028: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853185.8979096-9030-135675202134975=/root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853185.93031: variable 'ansible_module_compression' from source: unknown 7554 1726853185.93033: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7554 1726853185.93035: variable 'ansible_facts' from source: unknown 7554 1726853185.93093: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975/AnsiballZ_stat.py 7554 1726853185.93214: Sending initial data 7554 1726853185.93218: Sent initial data (151 bytes) 7554 1726853185.93645: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853185.93648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853185.93650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853185.93653: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853185.93655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.93702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853185.93705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.93770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853185.95388: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7554 1726853185.95395: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853185.95448: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853185.95507: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpqjf4mezi /root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975/AnsiballZ_stat.py <<< 7554 1726853185.95510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975/AnsiballZ_stat.py" <<< 7554 1726853185.95561: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpqjf4mezi" to remote "/root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975/AnsiballZ_stat.py" <<< 7554 1726853185.95568: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975/AnsiballZ_stat.py" <<< 7554 1726853185.96177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853185.96216: stderr chunk (state=3): >>><<< 7554 1726853185.96221: stdout chunk (state=3): >>><<< 7554 1726853185.96244: done transferring module to remote 7554 1726853185.96254: _low_level_execute_command(): starting 7554 1726853185.96259: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975/ /root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975/AnsiballZ_stat.py && sleep 0' 7554 1726853185.96665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853185.96674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853185.96698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.96700: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853185.96703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.96756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853185.96766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.96822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853185.98651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853185.98680: stderr chunk (state=3): >>><<< 7554 1726853185.98683: stdout chunk (state=3): >>><<< 7554 1726853185.98696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853185.98698: _low_level_execute_command(): starting 7554 1726853185.98703: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975/AnsiballZ_stat.py && sleep 0' 7554 1726853185.99119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853185.99123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853185.99125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853185.99127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853185.99175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' <<< 7554 1726853185.99178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853185.99248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853186.14877: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25529, "dev": 23, "nlink": 1, "atime": 1726853177.4202847, "mtime": 1726853177.4202847, "ctime": 1726853177.4202847, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7554 1726853186.16211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853186.16233: stderr chunk (state=3): >>><<< 7554 1726853186.16236: stdout chunk (state=3): >>><<< 7554 1726853186.16261: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25529, "dev": 23, "nlink": 1, "atime": 1726853177.4202847, "mtime": 1726853177.4202847, "ctime": 1726853177.4202847, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853186.16299: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853186.16306: _low_level_execute_command(): starting 7554 1726853186.16311: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853185.8979096-9030-135675202134975/ > /dev/null 2>&1 && sleep 0' 7554 1726853186.16738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853186.16749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853186.16752: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853186.16754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853186.16756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853186.16802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853186.16807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853186.16810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853186.16866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853186.18739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853186.18766: stderr chunk (state=3): >>><<< 7554 1726853186.18769: stdout chunk (state=3): >>><<< 7554 1726853186.18783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853186.18791: handler run complete 7554 1726853186.18819: attempt loop complete, returning result 7554 1726853186.18822: _execute() done 7554 1726853186.18824: dumping result to json 7554 1726853186.18829: done dumping result, returning 7554 1726853186.18836: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [02083763-bbaf-bdc3-98b6-0000000016ba] 7554 1726853186.18844: sending task result for task 02083763-bbaf-bdc3-98b6-0000000016ba 7554 1726853186.18950: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000016ba 7554 1726853186.18952: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853177.4202847, "block_size": 4096, "blocks": 0, "ctime": 1726853177.4202847, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25529, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726853177.4202847, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7554 1726853186.19061: no more pending results, returning what we have 7554 
1726853186.19064: results queue empty 7554 1726853186.19064: checking for any_errors_fatal 7554 1726853186.19066: done checking for any_errors_fatal 7554 1726853186.19067: checking for max_fail_percentage 7554 1726853186.19068: done checking for max_fail_percentage 7554 1726853186.19069: checking to see if all hosts have failed and the running result is not ok 7554 1726853186.19070: done checking to see if all hosts have failed 7554 1726853186.19072: getting the remaining hosts for this loop 7554 1726853186.19074: done getting the remaining hosts for this loop 7554 1726853186.19077: getting the next task for host managed_node3 7554 1726853186.19085: done getting next task for host managed_node3 7554 1726853186.19088: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7554 1726853186.19090: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853186.19094: getting variables 7554 1726853186.19096: in VariableManager get_vars() 7554 1726853186.19137: Calling all_inventory to load vars for managed_node3 7554 1726853186.19139: Calling groups_inventory to load vars for managed_node3 7554 1726853186.19141: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853186.19154: Calling all_plugins_play to load vars for managed_node3 7554 1726853186.19156: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853186.19159: Calling groups_plugins_play to load vars for managed_node3 7554 1726853186.20092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853186.21513: done with get_vars() 7554 1726853186.21538: done getting variables 7554 1726853186.21603: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853186.21722: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:26:26 -0400 (0:00:00.366) 0:00:40.185 ****** 7554 1726853186.21756: entering _queue_task() for managed_node3/assert 7554 1726853186.22011: worker is 1 (out of 1 available) 7554 1726853186.22024: exiting _queue_task() for managed_node3/assert 7554 1726853186.22035: done queuing things up, now waiting for results queue to drain 7554 1726853186.22037: waiting for pending results... 
7554 1726853186.22218: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7554 1726853186.22300: in run() - task 02083763-bbaf-bdc3-98b6-00000000143b 7554 1726853186.22309: variable 'ansible_search_path' from source: unknown 7554 1726853186.22312: variable 'ansible_search_path' from source: unknown 7554 1726853186.22341: calling self._execute() 7554 1726853186.22417: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853186.22424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853186.22433: variable 'omit' from source: magic vars 7554 1726853186.22712: variable 'ansible_distribution_major_version' from source: facts 7554 1726853186.22722: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853186.22733: variable 'omit' from source: magic vars 7554 1726853186.22759: variable 'omit' from source: magic vars 7554 1726853186.22830: variable 'interface' from source: play vars 7554 1726853186.22848: variable 'omit' from source: magic vars 7554 1726853186.22882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853186.22910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853186.22926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853186.22940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853186.22953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853186.22977: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853186.22981: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853186.22984: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853186.23055: Set connection var ansible_shell_executable to /bin/sh 7554 1726853186.23063: Set connection var ansible_pipelining to False 7554 1726853186.23066: Set connection var ansible_shell_type to sh 7554 1726853186.23069: Set connection var ansible_connection to ssh 7554 1726853186.23076: Set connection var ansible_timeout to 10 7554 1726853186.23081: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853186.23099: variable 'ansible_shell_executable' from source: unknown 7554 1726853186.23102: variable 'ansible_connection' from source: unknown 7554 1726853186.23105: variable 'ansible_module_compression' from source: unknown 7554 1726853186.23107: variable 'ansible_shell_type' from source: unknown 7554 1726853186.23110: variable 'ansible_shell_executable' from source: unknown 7554 1726853186.23112: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853186.23115: variable 'ansible_pipelining' from source: unknown 7554 1726853186.23117: variable 'ansible_timeout' from source: unknown 7554 1726853186.23119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853186.23223: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853186.23231: variable 'omit' from source: magic vars 7554 1726853186.23238: starting attempt loop 7554 1726853186.23241: running the handler 7554 1726853186.23330: variable 'interface_stat' from source: set_fact 7554 1726853186.23348: Evaluated conditional (interface_stat.stat.exists): True 7554 1726853186.23351: handler run complete 7554 1726853186.23362: attempt loop complete, returning result 7554 1726853186.23365: _execute() done 
7554 1726853186.23368: dumping result to json 7554 1726853186.23372: done dumping result, returning 7554 1726853186.23377: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [02083763-bbaf-bdc3-98b6-00000000143b] 7554 1726853186.23388: sending task result for task 02083763-bbaf-bdc3-98b6-00000000143b 7554 1726853186.23466: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000143b 7554 1726853186.23469: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853186.23515: no more pending results, returning what we have 7554 1726853186.23518: results queue empty 7554 1726853186.23519: checking for any_errors_fatal 7554 1726853186.23528: done checking for any_errors_fatal 7554 1726853186.23529: checking for max_fail_percentage 7554 1726853186.23530: done checking for max_fail_percentage 7554 1726853186.23531: checking to see if all hosts have failed and the running result is not ok 7554 1726853186.23532: done checking to see if all hosts have failed 7554 1726853186.23533: getting the remaining hosts for this loop 7554 1726853186.23534: done getting the remaining hosts for this loop 7554 1726853186.23538: getting the next task for host managed_node3 7554 1726853186.23548: done getting next task for host managed_node3 7554 1726853186.23551: ^ task is: TASK: Include the task 'assert_profile_present.yml' 7554 1726853186.23553: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853186.23557: getting variables 7554 1726853186.23558: in VariableManager get_vars() 7554 1726853186.23604: Calling all_inventory to load vars for managed_node3 7554 1726853186.23607: Calling groups_inventory to load vars for managed_node3 7554 1726853186.23609: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853186.23619: Calling all_plugins_play to load vars for managed_node3 7554 1726853186.23622: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853186.23624: Calling groups_plugins_play to load vars for managed_node3 7554 1726853186.24946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853186.25804: done with get_vars() 7554 1726853186.25820: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:110 Friday 20 September 2024 13:26:26 -0400 (0:00:00.041) 0:00:40.226 ****** 7554 1726853186.25887: entering _queue_task() for managed_node3/include_tasks 7554 1726853186.26109: worker is 1 (out of 1 available) 7554 1726853186.26123: exiting _queue_task() for managed_node3/include_tasks 7554 1726853186.26135: done queuing things up, now waiting for results queue to drain 7554 1726853186.26137: waiting for pending results... 
7554 1726853186.26324: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 7554 1726853186.26399: in run() - task 02083763-bbaf-bdc3-98b6-0000000000fe 7554 1726853186.26411: variable 'ansible_search_path' from source: unknown 7554 1726853186.26439: calling self._execute() 7554 1726853186.26522: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853186.26527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853186.26536: variable 'omit' from source: magic vars 7554 1726853186.26833: variable 'ansible_distribution_major_version' from source: facts 7554 1726853186.26842: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853186.26850: _execute() done 7554 1726853186.26853: dumping result to json 7554 1726853186.26857: done dumping result, returning 7554 1726853186.26865: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [02083763-bbaf-bdc3-98b6-0000000000fe] 7554 1726853186.26872: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000fe 7554 1726853186.26956: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000fe 7554 1726853186.26959: WORKER PROCESS EXITING 7554 1726853186.26986: no more pending results, returning what we have 7554 1726853186.26991: in VariableManager get_vars() 7554 1726853186.27043: Calling all_inventory to load vars for managed_node3 7554 1726853186.27046: Calling groups_inventory to load vars for managed_node3 7554 1726853186.27048: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853186.27059: Calling all_plugins_play to load vars for managed_node3 7554 1726853186.27062: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853186.27065: Calling groups_plugins_play to load vars for managed_node3 7554 1726853186.27831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 7554 1726853186.28782: done with get_vars() 7554 1726853186.28796: variable 'ansible_search_path' from source: unknown 7554 1726853186.28806: we have included files to process 7554 1726853186.28807: generating all_blocks data 7554 1726853186.28808: done generating all_blocks data 7554 1726853186.28811: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7554 1726853186.28811: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7554 1726853186.28813: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7554 1726853186.28892: in VariableManager get_vars() 7554 1726853186.28913: done with get_vars() 7554 1726853186.29087: done processing included file 7554 1726853186.29088: iterating over new_blocks loaded from include file 7554 1726853186.29089: in VariableManager get_vars() 7554 1726853186.29104: done with get_vars() 7554 1726853186.29105: filtering new block on tags 7554 1726853186.29119: done filtering new block on tags 7554 1726853186.29121: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 7554 1726853186.29125: extending task lists for all hosts with included blocks 7554 1726853186.32147: done extending task lists 7554 1726853186.32149: done processing included files 7554 1726853186.32150: results queue empty 7554 1726853186.32150: checking for any_errors_fatal 7554 1726853186.32153: done checking for any_errors_fatal 7554 1726853186.32154: checking for max_fail_percentage 7554 1726853186.32155: done checking for max_fail_percentage 7554 1726853186.32156: checking to see if all hosts have failed and the 
running result is not ok 7554 1726853186.32157: done checking to see if all hosts have failed 7554 1726853186.32157: getting the remaining hosts for this loop 7554 1726853186.32158: done getting the remaining hosts for this loop 7554 1726853186.32160: getting the next task for host managed_node3 7554 1726853186.32162: done getting next task for host managed_node3 7554 1726853186.32164: ^ task is: TASK: Include the task 'get_profile_stat.yml' 7554 1726853186.32166: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853186.32167: getting variables 7554 1726853186.32168: in VariableManager get_vars() 7554 1726853186.32185: Calling all_inventory to load vars for managed_node3 7554 1726853186.32186: Calling groups_inventory to load vars for managed_node3 7554 1726853186.32187: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853186.32194: Calling all_plugins_play to load vars for managed_node3 7554 1726853186.32195: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853186.32197: Calling groups_plugins_play to load vars for managed_node3 7554 1726853186.32854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853186.33841: done with get_vars() 7554 1726853186.33858: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:26:26 -0400 (0:00:00.080) 0:00:40.306 ****** 7554 1726853186.33918: entering _queue_task() for managed_node3/include_tasks 7554 1726853186.34187: worker is 1 (out of 1 available) 7554 1726853186.34201: exiting _queue_task() for managed_node3/include_tasks 7554 1726853186.34213: done queuing things up, now waiting for results queue to drain 7554 1726853186.34215: waiting for pending results... 
7554 1726853186.34402: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml'
7554 1726853186.34482: in run() - task 02083763-bbaf-bdc3-98b6-0000000016d2
7554 1726853186.34494: variable 'ansible_search_path' from source: unknown
7554 1726853186.34499: variable 'ansible_search_path' from source: unknown
7554 1726853186.34526: calling self._execute()
7554 1726853186.34604: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853186.34610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853186.34618: variable 'omit' from source: magic vars
7554 1726853186.34910: variable 'ansible_distribution_major_version' from source: facts
7554 1726853186.34920: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853186.34925: _execute() done
7554 1726853186.34928: dumping result to json
7554 1726853186.34932: done dumping result, returning
7554 1726853186.34937: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-bdc3-98b6-0000000016d2]
7554 1726853186.34947: sending task result for task 02083763-bbaf-bdc3-98b6-0000000016d2
7554 1726853186.35030: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000016d2
7554 1726853186.35033: WORKER PROCESS EXITING
7554 1726853186.35060: no more pending results, returning what we have
7554 1726853186.35066: in VariableManager get_vars()
7554 1726853186.35124: Calling all_inventory to load vars for managed_node3
7554 1726853186.35126: Calling groups_inventory to load vars for managed_node3
7554 1726853186.35129: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853186.35141: Calling all_plugins_play to load vars for managed_node3
7554 1726853186.35145: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853186.35148: Calling groups_plugins_play to load vars for managed_node3
7554 1726853186.35936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853186.36792: done with get_vars()
7554 1726853186.36807: variable 'ansible_search_path' from source: unknown
7554 1726853186.36808: variable 'ansible_search_path' from source: unknown
7554 1726853186.36833: we have included files to process
7554 1726853186.36833: generating all_blocks data
7554 1726853186.36835: done generating all_blocks data
7554 1726853186.36835: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
7554 1726853186.36836: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
7554 1726853186.36838: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
7554 1726853186.37417: done processing included file
7554 1726853186.37418: iterating over new_blocks loaded from include file
7554 1726853186.37420: in VariableManager get_vars()
7554 1726853186.37436: done with get_vars()
7554 1726853186.37437: filtering new block on tags
7554 1726853186.37452: done filtering new block on tags
7554 1726853186.37455: in VariableManager get_vars()
7554 1726853186.37470: done with get_vars()
7554 1726853186.37472: filtering new block on tags
7554 1726853186.37485: done filtering new block on tags
7554 1726853186.37487: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3
7554 1726853186.37490: extending task lists for all hosts with included blocks
7554 1726853186.37587: done extending task lists
7554 1726853186.37588: done processing included files
7554 1726853186.37589: results queue empty
7554 1726853186.37589: checking for any_errors_fatal
7554 1726853186.37591: done checking for any_errors_fatal
7554 1726853186.37591: checking for max_fail_percentage
7554 1726853186.37592: done checking for max_fail_percentage
7554 1726853186.37593: checking to see if all hosts have failed and the running result is not ok
7554 1726853186.37593: done checking to see if all hosts have failed
7554 1726853186.37594: getting the remaining hosts for this loop
7554 1726853186.37595: done getting the remaining hosts for this loop
7554 1726853186.37596: getting the next task for host managed_node3
7554 1726853186.37599: done getting next task for host managed_node3
7554 1726853186.37600: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
7554 1726853186.37602: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853186.37604: getting variables
7554 1726853186.37604: in VariableManager get_vars()
7554 1726853186.37615: Calling all_inventory to load vars for managed_node3
7554 1726853186.37617: Calling groups_inventory to load vars for managed_node3
7554 1726853186.37618: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853186.37622: Calling all_plugins_play to load vars for managed_node3
7554 1726853186.37623: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853186.37625: Calling groups_plugins_play to load vars for managed_node3
7554 1726853186.38293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853186.39120: done with get_vars()
7554 1726853186.39133: done getting variables
7554 1726853186.39160: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024  13:26:26 -0400 (0:00:00.052)       0:00:40.359 ******
7554 1726853186.39183: entering _queue_task() for managed_node3/set_fact
7554 1726853186.39432: worker is 1 (out of 1 available)
7554 1726853186.39447: exiting _queue_task() for managed_node3/set_fact
7554 1726853186.39460: done queuing things up, now waiting for results queue to drain
7554 1726853186.39462: waiting for pending results...
7554 1726853186.39645: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag
7554 1726853186.39723: in run() - task 02083763-bbaf-bdc3-98b6-00000000195f
7554 1726853186.39735: variable 'ansible_search_path' from source: unknown
7554 1726853186.39739: variable 'ansible_search_path' from source: unknown
7554 1726853186.39767: calling self._execute()
7554 1726853186.39846: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853186.39853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853186.39862: variable 'omit' from source: magic vars
7554 1726853186.40142: variable 'ansible_distribution_major_version' from source: facts
7554 1726853186.40154: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853186.40160: variable 'omit' from source: magic vars
7554 1726853186.40193: variable 'omit' from source: magic vars
7554 1726853186.40218: variable 'omit' from source: magic vars
7554 1726853186.40254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7554 1726853186.40282: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7554 1726853186.40300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7554 1726853186.40314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7554 1726853186.40323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7554 1726853186.40353: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7554 1726853186.40357: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853186.40360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853186.40426: Set connection var ansible_shell_executable to /bin/sh
7554 1726853186.40433: Set connection var ansible_pipelining to False
7554 1726853186.40435: Set connection var ansible_shell_type to sh
7554 1726853186.40438: Set connection var ansible_connection to ssh
7554 1726853186.40451: Set connection var ansible_timeout to 10
7554 1726853186.40454: Set connection var ansible_module_compression to ZIP_DEFLATED
7554 1726853186.40474: variable 'ansible_shell_executable' from source: unknown
7554 1726853186.40477: variable 'ansible_connection' from source: unknown
7554 1726853186.40480: variable 'ansible_module_compression' from source: unknown
7554 1726853186.40483: variable 'ansible_shell_type' from source: unknown
7554 1726853186.40485: variable 'ansible_shell_executable' from source: unknown
7554 1726853186.40487: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853186.40490: variable 'ansible_pipelining' from source: unknown
7554 1726853186.40492: variable 'ansible_timeout' from source: unknown
7554 1726853186.40496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853186.40599: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7554 1726853186.40608: variable 'omit' from source: magic vars
7554 1726853186.40613: starting attempt loop
7554 1726853186.40616: running the handler
7554 1726853186.40627: handler run complete
7554 1726853186.40635: attempt loop complete, returning result
7554 1726853186.40638: _execute() done
7554 1726853186.40640: dumping result to json
7554 1726853186.40643: done dumping result, returning
7554 1726853186.40652: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-bdc3-98b6-00000000195f]
7554 1726853186.40658: sending task result for task 02083763-bbaf-bdc3-98b6-00000000195f
7554 1726853186.40738: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000195f
7554 1726853186.40740: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
7554 1726853186.40824: no more pending results, returning what we have
7554 1726853186.40826: results queue empty
7554 1726853186.40827: checking for any_errors_fatal
7554 1726853186.40829: done checking for any_errors_fatal
7554 1726853186.40830: checking for max_fail_percentage
7554 1726853186.40831: done checking for max_fail_percentage
7554 1726853186.40832: checking to see if all hosts have failed and the running result is not ok
7554 1726853186.40833: done checking to see if all hosts have failed
7554 1726853186.40834: getting the remaining hosts for this loop
7554 1726853186.40835: done getting the remaining hosts for this loop
7554 1726853186.40838: getting the next task for host managed_node3
7554 1726853186.40844: done getting next task for host managed_node3
7554 1726853186.40846: ^ task is: TASK: Stat profile file
7554 1726853186.40849: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853186.40853: getting variables
7554 1726853186.40855: in VariableManager get_vars()
7554 1726853186.40900: Calling all_inventory to load vars for managed_node3
7554 1726853186.40902: Calling groups_inventory to load vars for managed_node3
7554 1726853186.40904: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853186.40914: Calling all_plugins_play to load vars for managed_node3
7554 1726853186.40916: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853186.40918: Calling groups_plugins_play to load vars for managed_node3
7554 1726853186.41683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853186.42565: done with get_vars()
7554 1726853186.42583: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Friday 20 September 2024  13:26:26 -0400 (0:00:00.034)       0:00:40.394 ******
7554 1726853186.42652: entering _queue_task() for managed_node3/stat
7554 1726853186.42897: worker is 1 (out of 1 available)
7554 1726853186.42912: exiting _queue_task() for managed_node3/stat
7554 1726853186.42925: done queuing things up, now waiting for results queue to drain
7554 1726853186.42926: waiting for pending results...
7554 1726853186.43103: running TaskExecutor() for managed_node3/TASK: Stat profile file
7554 1726853186.43172: in run() - task 02083763-bbaf-bdc3-98b6-000000001960
7554 1726853186.43185: variable 'ansible_search_path' from source: unknown
7554 1726853186.43189: variable 'ansible_search_path' from source: unknown
7554 1726853186.43215: calling self._execute()
7554 1726853186.43294: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853186.43298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853186.43310: variable 'omit' from source: magic vars
7554 1726853186.43590: variable 'ansible_distribution_major_version' from source: facts
7554 1726853186.43599: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853186.43605: variable 'omit' from source: magic vars
7554 1726853186.43634: variable 'omit' from source: magic vars
7554 1726853186.43706: variable 'profile' from source: include params
7554 1726853186.43710: variable 'interface' from source: play vars
7554 1726853186.43755: variable 'interface' from source: play vars
7554 1726853186.43769: variable 'omit' from source: magic vars
7554 1726853186.43806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7554 1726853186.43834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7554 1726853186.43853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7554 1726853186.43866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7554 1726853186.43878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7554 1726853186.43901: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7554 1726853186.43904: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853186.43909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853186.43978: Set connection var ansible_shell_executable to /bin/sh
7554 1726853186.43985: Set connection var ansible_pipelining to False
7554 1726853186.43988: Set connection var ansible_shell_type to sh
7554 1726853186.43990: Set connection var ansible_connection to ssh
7554 1726853186.43997: Set connection var ansible_timeout to 10
7554 1726853186.44002: Set connection var ansible_module_compression to ZIP_DEFLATED
7554 1726853186.44027: variable 'ansible_shell_executable' from source: unknown
7554 1726853186.44030: variable 'ansible_connection' from source: unknown
7554 1726853186.44033: variable 'ansible_module_compression' from source: unknown
7554 1726853186.44035: variable 'ansible_shell_type' from source: unknown
7554 1726853186.44037: variable 'ansible_shell_executable' from source: unknown
7554 1726853186.44039: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853186.44041: variable 'ansible_pipelining' from source: unknown
7554 1726853186.44047: variable 'ansible_timeout' from source: unknown
7554 1726853186.44049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853186.44284: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
7554 1726853186.44288: variable 'omit' from source: magic vars
7554 1726853186.44291: starting attempt loop
7554 1726853186.44293: running the handler
7554 1726853186.44296: _low_level_execute_command(): starting
7554 1726853186.44298: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7554 1726853186.44940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7554 1726853186.44946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7554 1726853186.45058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7554 1726853186.45074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<<
7554 1726853186.45092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7554 1726853186.45190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7554 1726853186.46883: stdout chunk (state=3): >>>/root <<<
7554 1726853186.46987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7554 1726853186.47013: stderr chunk (state=3): >>><<<
7554 1726853186.47016: stdout chunk (state=3): >>><<<
7554 1726853186.47038: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7554 1726853186.47051: _low_level_execute_command(): starting
7554 1726853186.47058: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414 `" && echo ansible-tmp-1726853186.4703882-9060-273767577301414="` echo /root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414 `" ) && sleep 0'
7554 1726853186.47939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<<
7554 1726853186.47972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7554 1726853186.48059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7554 1726853186.50023: stdout chunk (state=3): >>>ansible-tmp-1726853186.4703882-9060-273767577301414=/root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414 <<<
7554 1726853186.50206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7554 1726853186.50210: stdout chunk (state=3): >>><<<
7554 1726853186.50212: stderr chunk (state=3): >>><<<
7554 1726853186.50377: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853186.4703882-9060-273767577301414=/root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7554 1726853186.50381: variable 'ansible_module_compression' from source: unknown
7554 1726853186.50384: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
7554 1726853186.50412: variable 'ansible_facts' from source: unknown
7554 1726853186.50525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414/AnsiballZ_stat.py
7554 1726853186.50828: Sending initial data
7554 1726853186.50832: Sent initial data (151 bytes)
7554 1726853186.51295: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7554 1726853186.51304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7554 1726853186.51315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7554 1726853186.51330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7554 1726853186.51389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853186.51437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
7554 1726853186.51448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7554 1726853186.51474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7554 1726853186.51560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7554 1726853186.53177: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
7554 1726853186.53270: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
7554 1726853186.53350: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmptxk3y0_a /root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414/AnsiballZ_stat.py <<<
7554 1726853186.53354: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414/AnsiballZ_stat.py" <<<
7554 1726853186.53410: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmptxk3y0_a" to remote "/root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414/AnsiballZ_stat.py" <<<
7554 1726853186.54228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7554 1726853186.54259: stderr chunk (state=3): >>><<<
7554 1726853186.54268: stdout chunk (state=3): >>><<<
7554 1726853186.54307: done transferring module to remote
7554 1726853186.54322: _low_level_execute_command(): starting
7554 1726853186.54341: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414/ /root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414/AnsiballZ_stat.py && sleep 0'
7554 1726853186.54890: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7554 1726853186.54910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<<
7554 1726853186.54913: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7554 1726853186.54916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853186.54965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
7554 1726853186.54969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
7554 1726853186.55036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7554 1726853186.57009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7554 1726853186.57012: stdout chunk (state=3): >>><<<
7554 1726853186.57020: stderr chunk (state=3): >>><<<
7554 1726853186.57022: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7554 1726853186.57026: _low_level_execute_command(): starting
7554 1726853186.57029: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414/AnsiballZ_stat.py && sleep 0'
7554 1726853186.57507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7554 1726853186.57510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853186.57512: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<<
7554 1726853186.57515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7554 1726853186.57517: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853186.57559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<<
7554 1726853186.57564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7554 1726853186.57648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7554 1726853186.73196: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<<
7554 1726853186.74583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<<
7554 1726853186.74610: stderr chunk (state=3): >>><<<
7554 1726853186.74613: stdout chunk (state=3): >>><<<
7554 1726853186.74633: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed.
7554 1726853186.74656: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
7554 1726853186.74664: _low_level_execute_command(): starting
7554 1726853186.74669: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853186.4703882-9060-273767577301414/ > /dev/null 2>&1 && sleep 0'
7554 1726853186.75132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7554 1726853186.75135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<<
7554 1726853186.75138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass
<<< 7554 1726853186.75140: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853186.75142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853186.75198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853186.75201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853186.75208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853186.75267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853186.77137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853186.77163: stderr chunk (state=3): >>><<< 7554 1726853186.77173: stdout chunk (state=3): >>><<< 7554 1726853186.77185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853186.77193: handler run complete 7554 1726853186.77211: attempt loop complete, returning result 7554 1726853186.77214: _execute() done 7554 1726853186.77217: dumping result to json 7554 1726853186.77219: done dumping result, returning 7554 1726853186.77226: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-bdc3-98b6-000000001960] 7554 1726853186.77231: sending task result for task 02083763-bbaf-bdc3-98b6-000000001960 7554 1726853186.77327: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001960 7554 1726853186.77330: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7554 1726853186.77384: no more pending results, returning what we have 7554 1726853186.77388: results queue empty 7554 1726853186.77388: checking for any_errors_fatal 7554 1726853186.77398: done checking for any_errors_fatal 7554 1726853186.77399: checking for max_fail_percentage 7554 1726853186.77400: done checking for max_fail_percentage 7554 1726853186.77401: checking to see if all hosts have failed and the running result is not ok 7554 1726853186.77402: done checking to see if all hosts have failed 7554 1726853186.77403: getting the remaining hosts for this loop 7554 1726853186.77404: done getting the remaining hosts for this loop 7554 1726853186.77407: getting the next task for host managed_node3 7554 1726853186.77413: done getting next task for host managed_node3 7554 1726853186.77415: ^ task is: TASK: Set NM profile exist 
flag based on the profile files 7554 1726853186.77419: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853186.77424: getting variables 7554 1726853186.77426: in VariableManager get_vars() 7554 1726853186.77477: Calling all_inventory to load vars for managed_node3 7554 1726853186.77480: Calling groups_inventory to load vars for managed_node3 7554 1726853186.77482: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853186.77493: Calling all_plugins_play to load vars for managed_node3 7554 1726853186.77496: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853186.77498: Calling groups_plugins_play to load vars for managed_node3 7554 1726853186.78422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853186.79267: done with get_vars() 7554 1726853186.79285: done getting variables 7554 1726853186.79328: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:26:26 -0400 (0:00:00.366) 0:00:40.761 ****** 7554 1726853186.79351: entering _queue_task() for managed_node3/set_fact 7554 1726853186.79589: worker is 1 (out of 1 available) 7554 1726853186.79602: exiting _queue_task() for managed_node3/set_fact 7554 1726853186.79616: done queuing things up, now waiting for results queue to drain 7554 1726853186.79618: waiting for pending results... 7554 1726853186.79805: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 7554 1726853186.79889: in run() - task 02083763-bbaf-bdc3-98b6-000000001961 7554 1726853186.79900: variable 'ansible_search_path' from source: unknown 7554 1726853186.79903: variable 'ansible_search_path' from source: unknown 7554 1726853186.79930: calling self._execute() 7554 1726853186.80011: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853186.80015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853186.80023: variable 'omit' from source: magic vars 7554 1726853186.80314: variable 'ansible_distribution_major_version' from source: facts 7554 1726853186.80324: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853186.80412: variable 'profile_stat' from source: set_fact 7554 1726853186.80423: Evaluated conditional (profile_stat.stat.exists): False 7554 1726853186.80427: when evaluation is False, skipping this task 7554 1726853186.80429: _execute() done 7554 1726853186.80432: dumping result to json 7554 1726853186.80435: done dumping result, returning 7554 
1726853186.80440: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-bdc3-98b6-000000001961] 7554 1726853186.80448: sending task result for task 02083763-bbaf-bdc3-98b6-000000001961 7554 1726853186.80532: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001961 7554 1726853186.80536: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7554 1726853186.80582: no more pending results, returning what we have 7554 1726853186.80586: results queue empty 7554 1726853186.80587: checking for any_errors_fatal 7554 1726853186.80596: done checking for any_errors_fatal 7554 1726853186.80597: checking for max_fail_percentage 7554 1726853186.80598: done checking for max_fail_percentage 7554 1726853186.80599: checking to see if all hosts have failed and the running result is not ok 7554 1726853186.80600: done checking to see if all hosts have failed 7554 1726853186.80601: getting the remaining hosts for this loop 7554 1726853186.80603: done getting the remaining hosts for this loop 7554 1726853186.80606: getting the next task for host managed_node3 7554 1726853186.80612: done getting next task for host managed_node3 7554 1726853186.80614: ^ task is: TASK: Get NM profile info 7554 1726853186.80617: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853186.80622: getting variables 7554 1726853186.80623: in VariableManager get_vars() 7554 1726853186.80673: Calling all_inventory to load vars for managed_node3 7554 1726853186.80678: Calling groups_inventory to load vars for managed_node3 7554 1726853186.80680: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853186.80691: Calling all_plugins_play to load vars for managed_node3 7554 1726853186.80693: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853186.80696: Calling groups_plugins_play to load vars for managed_node3 7554 1726853186.81464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853186.82429: done with get_vars() 7554 1726853186.82444: done getting variables 7554 1726853186.82487: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:26:26 -0400 (0:00:00.031) 0:00:40.792 ****** 7554 1726853186.82511: entering _queue_task() for managed_node3/shell 7554 1726853186.82739: worker is 1 (out of 1 available) 7554 1726853186.82754: exiting _queue_task() for managed_node3/shell 7554 1726853186.82768: done queuing things up, now waiting for results queue to drain 7554 1726853186.82770: 
waiting for pending results... 7554 1726853186.82944: running TaskExecutor() for managed_node3/TASK: Get NM profile info 7554 1726853186.83027: in run() - task 02083763-bbaf-bdc3-98b6-000000001962 7554 1726853186.83040: variable 'ansible_search_path' from source: unknown 7554 1726853186.83043: variable 'ansible_search_path' from source: unknown 7554 1726853186.83073: calling self._execute() 7554 1726853186.83146: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853186.83154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853186.83161: variable 'omit' from source: magic vars 7554 1726853186.83436: variable 'ansible_distribution_major_version' from source: facts 7554 1726853186.83442: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853186.83450: variable 'omit' from source: magic vars 7554 1726853186.83483: variable 'omit' from source: magic vars 7554 1726853186.83556: variable 'profile' from source: include params 7554 1726853186.83560: variable 'interface' from source: play vars 7554 1726853186.83606: variable 'interface' from source: play vars 7554 1726853186.83621: variable 'omit' from source: magic vars 7554 1726853186.83660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853186.83687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853186.83704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853186.83717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853186.83727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853186.83755: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
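
The log up to this point shows the flow of the profile check: the `stat` task on `/etc/sysconfig/network-scripts/ifcfg-veth0` returned `"exists": false`, the `set_fact` flag task was therefore skipped, and the run now falls through to querying NetworkManager directly with the shell task. A minimal standalone sketch of that same two-step check, assuming a Python environment with `nmcli` available on the host (the file path and the `nmcli` pipeline are taken verbatim from the task output in this log; the function names are illustrative, not part of the role):

```python
import os
import subprocess

# Path checked by the "Stat profile file" task in the log above.
IFCFG_PATH = "/etc/sysconfig/network-scripts/ifcfg-veth0"


def profile_file_exists(path: str = IFCFG_PATH) -> bool:
    """Mirror the stat task: does the initscripts ifcfg file exist?"""
    return os.path.exists(path)


def nm_profile_info(interface: str = "veth0") -> str:
    """Mirror the "Get NM profile info" shell task from the log.

    Runs the same pipeline the task executed on the managed node
    (requires nmcli, so this is illustrative rather than portable).
    Returns the matching "NAME FILENAME" lines, e.g. a line pointing
    at /etc/NetworkManager/system-connections/<name>.nmconnection.
    """
    cmd = (
        f"nmcli -f NAME,FILENAME connection show "
        f"| grep {interface} | grep /etc"
    )
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return result.stdout
```

On the node in this log, `profile_file_exists()` would return `False` (matching the skipped task's `false_condition`), while the `nmcli` query found `veth0` backed by an `/etc/NetworkManager/system-connections/` keyfile, which is why the shell task reports `rc: 0`.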
7554 1726853186.83758: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853186.83761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853186.83831: Set connection var ansible_shell_executable to /bin/sh 7554 1726853186.83838: Set connection var ansible_pipelining to False 7554 1726853186.83841: Set connection var ansible_shell_type to sh 7554 1726853186.83844: Set connection var ansible_connection to ssh 7554 1726853186.83853: Set connection var ansible_timeout to 10 7554 1726853186.83858: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853186.83880: variable 'ansible_shell_executable' from source: unknown 7554 1726853186.83884: variable 'ansible_connection' from source: unknown 7554 1726853186.83886: variable 'ansible_module_compression' from source: unknown 7554 1726853186.83888: variable 'ansible_shell_type' from source: unknown 7554 1726853186.83891: variable 'ansible_shell_executable' from source: unknown 7554 1726853186.83893: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853186.83895: variable 'ansible_pipelining' from source: unknown 7554 1726853186.83897: variable 'ansible_timeout' from source: unknown 7554 1726853186.83899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853186.84002: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853186.84011: variable 'omit' from source: magic vars 7554 1726853186.84016: starting attempt loop 7554 1726853186.84019: running the handler 7554 1726853186.84028: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853186.84049: _low_level_execute_command(): starting 7554 1726853186.84055: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853186.84545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853186.84584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853186.84588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853186.84590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853186.84593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853186.84639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853186.84645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853186.84647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853186.84715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853186.86426: stdout chunk (state=3): 
>>>/root <<< 7554 1726853186.86529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853186.86553: stderr chunk (state=3): >>><<< 7554 1726853186.86556: stdout chunk (state=3): >>><<< 7554 1726853186.86579: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853186.86590: _low_level_execute_command(): starting 7554 1726853186.86595: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987 `" && echo ansible-tmp-1726853186.8657875-9077-189580610117987="` echo /root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987 `" ) && sleep 0' 7554 1726853186.87037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853186.87040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853186.87043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853186.87045: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853186.87047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853186.87097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853186.87101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853186.87174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853186.89134: stdout chunk (state=3): >>>ansible-tmp-1726853186.8657875-9077-189580610117987=/root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987 <<< 7554 1726853186.89248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853186.89268: stderr chunk (state=3): >>><<< 7554 1726853186.89273: stdout chunk (state=3): >>><<< 7554 1726853186.89289: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853186.8657875-9077-189580610117987=/root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853186.89313: variable 'ansible_module_compression' from source: unknown 7554 1726853186.89356: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853186.89385: variable 'ansible_facts' from source: unknown 7554 1726853186.89441: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987/AnsiballZ_command.py 7554 1726853186.89532: Sending initial data 7554 1726853186.89535: Sent initial data (154 bytes) 7554 1726853186.89990: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853186.90026: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853186.90181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853186.90226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853186.90321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853186.91958: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: 
Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853186.92039: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853186.92117: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp82x0ygmv /root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987/AnsiballZ_command.py <<< 7554 1726853186.92140: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987/AnsiballZ_command.py" <<< 7554 1726853186.92209: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp82x0ygmv" to remote "/root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987/AnsiballZ_command.py" <<< 7554 1726853186.93229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853186.93233: stdout chunk (state=3): >>><<< 7554 1726853186.93235: stderr chunk (state=3): >>><<< 7554 1726853186.93237: done transferring module to remote 7554 1726853186.93240: _low_level_execute_command(): starting 7554 1726853186.93246: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987/ /root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987/AnsiballZ_command.py && sleep 0' 7554 1726853186.93957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853186.94027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853186.94074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853186.94126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853186.96039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853186.96056: stdout chunk (state=3): >>><<< 7554 1726853186.96168: stderr chunk (state=3): >>><<< 7554 1726853186.96174: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853186.96177: _low_level_execute_command(): starting 7554 1726853186.96180: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987/AnsiballZ_command.py && sleep 0' 7554 1726853186.96684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853186.96785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853186.96805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853186.96822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853186.97113: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853187.14957: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 13:26:27.128338", "end": "2024-09-20 13:26:27.146384", "delta": "0:00:00.018046", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853187.16504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853187.16580: stdout chunk (state=3): >>><<< 7554 1726853187.16583: stderr chunk (state=3): >>><<< 7554 1726853187.16587: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 13:26:27.128338", "end": "2024-09-20 13:26:27.146384", "delta": "0:00:00.018046", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853187.16608: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853187.16630: _low_level_execute_command(): starting 7554 1726853187.16639: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853186.8657875-9077-189580610117987/ > /dev/null 
2>&1 && sleep 0' 7554 1726853187.17291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853187.17308: stderr chunk (state=3): >>>debug2: match found <<< 7554 1726853187.17398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853187.17418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853187.17437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853187.17525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853187.19481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853187.19485: stdout chunk (state=3): >>><<< 7554 1726853187.19491: stderr chunk (state=3): >>><<< 7554 1726853187.19507: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853187.19515: handler run complete 7554 1726853187.19677: Evaluated conditional (False): False 7554 1726853187.19681: attempt loop complete, returning result 7554 1726853187.19684: _execute() done 7554 1726853187.19686: dumping result to json 7554 1726853187.19689: done dumping result, returning 7554 1726853187.19691: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-bdc3-98b6-000000001962] 7554 1726853187.19693: sending task result for task 02083763-bbaf-bdc3-98b6-000000001962 7554 1726853187.19783: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001962 7554 1726853187.19787: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.018046", "end": "2024-09-20 13:26:27.146384", "rc": 0, "start": "2024-09-20 13:26:27.128338" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 7554 1726853187.19887: no more pending results, returning what we have 7554 1726853187.19892: results 
queue empty 7554 1726853187.19893: checking for any_errors_fatal 7554 1726853187.19904: done checking for any_errors_fatal 7554 1726853187.19905: checking for max_fail_percentage 7554 1726853187.19906: done checking for max_fail_percentage 7554 1726853187.19908: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.19908: done checking to see if all hosts have failed 7554 1726853187.19909: getting the remaining hosts for this loop 7554 1726853187.19911: done getting the remaining hosts for this loop 7554 1726853187.19914: getting the next task for host managed_node3 7554 1726853187.19921: done getting next task for host managed_node3 7554 1726853187.19924: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7554 1726853187.19928: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853187.19933: getting variables 7554 1726853187.19934: in VariableManager get_vars() 7554 1726853187.20000: Calling all_inventory to load vars for managed_node3 7554 1726853187.20003: Calling groups_inventory to load vars for managed_node3 7554 1726853187.20006: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.20017: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.20021: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.20024: Calling groups_plugins_play to load vars for managed_node3 7554 1726853187.21643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853187.23274: done with get_vars() 7554 1726853187.23306: done getting variables 7554 1726853187.23377: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:26:27 -0400 (0:00:00.408) 0:00:41.201 ****** 7554 1726853187.23410: entering _queue_task() for managed_node3/set_fact 7554 1726853187.23903: worker is 1 (out of 1 available) 7554 1726853187.23913: exiting _queue_task() for managed_node3/set_fact 7554 1726853187.23926: done queuing things up, now waiting for results queue to drain 7554 1726853187.23927: waiting for pending results... 
7554 1726853187.24124: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7554 1726853187.24233: in run() - task 02083763-bbaf-bdc3-98b6-000000001963 7554 1726853187.24249: variable 'ansible_search_path' from source: unknown 7554 1726853187.24253: variable 'ansible_search_path' from source: unknown 7554 1726853187.24311: calling self._execute() 7554 1726853187.24394: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.24418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.24423: variable 'omit' from source: magic vars 7554 1726853187.24793: variable 'ansible_distribution_major_version' from source: facts 7554 1726853187.24831: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853187.25007: variable 'nm_profile_exists' from source: set_fact 7554 1726853187.25010: Evaluated conditional (nm_profile_exists.rc == 0): True 7554 1726853187.25012: variable 'omit' from source: magic vars 7554 1726853187.25015: variable 'omit' from source: magic vars 7554 1726853187.25050: variable 'omit' from source: magic vars 7554 1726853187.25094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853187.25129: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853187.25158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853187.25179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.25187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.25223: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853187.25226: 
variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.25228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.25332: Set connection var ansible_shell_executable to /bin/sh 7554 1726853187.25337: Set connection var ansible_pipelining to False 7554 1726853187.25340: Set connection var ansible_shell_type to sh 7554 1726853187.25346: Set connection var ansible_connection to ssh 7554 1726853187.25388: Set connection var ansible_timeout to 10 7554 1726853187.25391: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853187.25394: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.25397: variable 'ansible_connection' from source: unknown 7554 1726853187.25399: variable 'ansible_module_compression' from source: unknown 7554 1726853187.25401: variable 'ansible_shell_type' from source: unknown 7554 1726853187.25403: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.25405: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.25407: variable 'ansible_pipelining' from source: unknown 7554 1726853187.25409: variable 'ansible_timeout' from source: unknown 7554 1726853187.25412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.25577: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853187.25581: variable 'omit' from source: magic vars 7554 1726853187.25583: starting attempt loop 7554 1726853187.25586: running the handler 7554 1726853187.25598: handler run complete 7554 1726853187.25661: attempt loop complete, returning result 7554 1726853187.25664: _execute() done 7554 1726853187.25666: dumping result to json 
7554 1726853187.25668: done dumping result, returning 7554 1726853187.25672: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-bdc3-98b6-000000001963] 7554 1726853187.25674: sending task result for task 02083763-bbaf-bdc3-98b6-000000001963 7554 1726853187.25737: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001963 7554 1726853187.25740: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 7554 1726853187.25841: no more pending results, returning what we have 7554 1726853187.25845: results queue empty 7554 1726853187.25846: checking for any_errors_fatal 7554 1726853187.25857: done checking for any_errors_fatal 7554 1726853187.25858: checking for max_fail_percentage 7554 1726853187.25859: done checking for max_fail_percentage 7554 1726853187.25860: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.25862: done checking to see if all hosts have failed 7554 1726853187.25863: getting the remaining hosts for this loop 7554 1726853187.25864: done getting the remaining hosts for this loop 7554 1726853187.25868: getting the next task for host managed_node3 7554 1726853187.25879: done getting next task for host managed_node3 7554 1726853187.25883: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 7554 1726853187.25887: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853187.25893: getting variables 7554 1726853187.25895: in VariableManager get_vars() 7554 1726853187.25949: Calling all_inventory to load vars for managed_node3 7554 1726853187.25953: Calling groups_inventory to load vars for managed_node3 7554 1726853187.25955: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.25967: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.25970: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.26185: Calling groups_plugins_play to load vars for managed_node3 7554 1726853187.27189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853187.28039: done with get_vars() 7554 1726853187.28060: done getting variables 7554 1726853187.28126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853187.28252: variable 'profile' from source: include params 7554 1726853187.28256: variable 'interface' from source: play vars 7554 1726853187.28315: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:26:27 -0400 (0:00:00.049) 0:00:41.251 ****** 7554 1726853187.28352: entering _queue_task() for managed_node3/command 7554 1726853187.28722: worker is 1 (out of 1 available) 7554 1726853187.28734: exiting _queue_task() for managed_node3/command 7554 1726853187.28748: done queuing things up, now waiting for results queue to drain 7554 1726853187.28749: waiting for pending results... 7554 1726853187.29082: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 7554 1726853187.29159: in run() - task 02083763-bbaf-bdc3-98b6-000000001965 7554 1726853187.29175: variable 'ansible_search_path' from source: unknown 7554 1726853187.29179: variable 'ansible_search_path' from source: unknown 7554 1726853187.29204: calling self._execute() 7554 1726853187.29283: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.29289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.29298: variable 'omit' from source: magic vars 7554 1726853187.29580: variable 'ansible_distribution_major_version' from source: facts 7554 1726853187.29588: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853187.29673: variable 'profile_stat' from source: set_fact 7554 1726853187.29686: Evaluated conditional (profile_stat.stat.exists): False 7554 1726853187.29690: when evaluation is False, skipping this task 7554 1726853187.29693: _execute() done 7554 1726853187.29695: dumping result to json 7554 1726853187.29698: done dumping result, returning 7554 1726853187.29704: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [02083763-bbaf-bdc3-98b6-000000001965] 7554 1726853187.29714: sending task result for task 02083763-bbaf-bdc3-98b6-000000001965 7554 1726853187.29791: done sending 
task result for task 02083763-bbaf-bdc3-98b6-000000001965 7554 1726853187.29794: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7554 1726853187.29868: no more pending results, returning what we have 7554 1726853187.29873: results queue empty 7554 1726853187.29874: checking for any_errors_fatal 7554 1726853187.29883: done checking for any_errors_fatal 7554 1726853187.29883: checking for max_fail_percentage 7554 1726853187.29885: done checking for max_fail_percentage 7554 1726853187.29886: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.29887: done checking to see if all hosts have failed 7554 1726853187.29887: getting the remaining hosts for this loop 7554 1726853187.29889: done getting the remaining hosts for this loop 7554 1726853187.29892: getting the next task for host managed_node3 7554 1726853187.29897: done getting next task for host managed_node3 7554 1726853187.29900: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 7554 1726853187.29903: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7554 1726853187.29907: getting variables 7554 1726853187.29908: in VariableManager get_vars() 7554 1726853187.29955: Calling all_inventory to load vars for managed_node3 7554 1726853187.29957: Calling groups_inventory to load vars for managed_node3 7554 1726853187.29959: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.29976: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.29980: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.29983: Calling groups_plugins_play to load vars for managed_node3 7554 1726853187.30754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853187.32046: done with get_vars() 7554 1726853187.32073: done getting variables 7554 1726853187.32129: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853187.32241: variable 'profile' from source: include params 7554 1726853187.32245: variable 'interface' from source: play vars 7554 1726853187.32307: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:26:27 -0400 (0:00:00.039) 0:00:41.290 ****** 7554 1726853187.32339: entering _queue_task() for managed_node3/set_fact 7554 1726853187.32639: worker is 1 (out of 1 available) 7554 1726853187.32652: exiting _queue_task() for managed_node3/set_fact 7554 1726853187.32666: done queuing things up, now waiting for results queue to drain 7554 1726853187.32668: waiting for pending results... 
7554 1726853187.32867: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 7554 1726853187.32963: in run() - task 02083763-bbaf-bdc3-98b6-000000001966 7554 1726853187.32976: variable 'ansible_search_path' from source: unknown 7554 1726853187.32980: variable 'ansible_search_path' from source: unknown 7554 1726853187.33011: calling self._execute() 7554 1726853187.33087: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.33093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.33101: variable 'omit' from source: magic vars 7554 1726853187.33374: variable 'ansible_distribution_major_version' from source: facts 7554 1726853187.33385: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853187.33468: variable 'profile_stat' from source: set_fact 7554 1726853187.33482: Evaluated conditional (profile_stat.stat.exists): False 7554 1726853187.33485: when evaluation is False, skipping this task 7554 1726853187.33488: _execute() done 7554 1726853187.33490: dumping result to json 7554 1726853187.33493: done dumping result, returning 7554 1726853187.33499: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [02083763-bbaf-bdc3-98b6-000000001966] 7554 1726853187.33505: sending task result for task 02083763-bbaf-bdc3-98b6-000000001966 7554 1726853187.33588: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001966 7554 1726853187.33592: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7554 1726853187.33635: no more pending results, returning what we have 7554 1726853187.33638: results queue empty 7554 1726853187.33639: checking for any_errors_fatal 7554 1726853187.33649: done checking for any_errors_fatal 7554 1726853187.33650: checking for 
max_fail_percentage 7554 1726853187.33651: done checking for max_fail_percentage 7554 1726853187.33652: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.33653: done checking to see if all hosts have failed 7554 1726853187.33654: getting the remaining hosts for this loop 7554 1726853187.33655: done getting the remaining hosts for this loop 7554 1726853187.33658: getting the next task for host managed_node3 7554 1726853187.33665: done getting next task for host managed_node3 7554 1726853187.33668: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 7554 1726853187.33673: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853187.33678: getting variables 7554 1726853187.33680: in VariableManager get_vars() 7554 1726853187.33724: Calling all_inventory to load vars for managed_node3 7554 1726853187.33726: Calling groups_inventory to load vars for managed_node3 7554 1726853187.33728: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.33739: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.33745: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.33747: Calling groups_plugins_play to load vars for managed_node3 7554 1726853187.35127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853187.36331: done with get_vars() 7554 1726853187.36347: done getting variables 7554 1726853187.36393: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853187.36472: variable 'profile' from source: include params 7554 1726853187.36475: variable 'interface' from source: play vars 7554 1726853187.36516: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:26:27 -0400 (0:00:00.041) 0:00:41.332 ****** 7554 1726853187.36539: entering _queue_task() for managed_node3/command 7554 1726853187.36767: worker is 1 (out of 1 available) 7554 1726853187.36781: exiting _queue_task() for managed_node3/command 7554 1726853187.36793: done queuing things up, now waiting for results queue to drain 7554 1726853187.36794: waiting for pending results... 
7554 1726853187.36976: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 7554 1726853187.37061: in run() - task 02083763-bbaf-bdc3-98b6-000000001967 7554 1726853187.37074: variable 'ansible_search_path' from source: unknown 7554 1726853187.37079: variable 'ansible_search_path' from source: unknown 7554 1726853187.37105: calling self._execute() 7554 1726853187.37182: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.37186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.37195: variable 'omit' from source: magic vars 7554 1726853187.37461: variable 'ansible_distribution_major_version' from source: facts 7554 1726853187.37474: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853187.37633: variable 'profile_stat' from source: set_fact 7554 1726853187.37637: Evaluated conditional (profile_stat.stat.exists): False 7554 1726853187.37640: when evaluation is False, skipping this task 7554 1726853187.37643: _execute() done 7554 1726853187.37645: dumping result to json 7554 1726853187.37648: done dumping result, returning 7554 1726853187.37650: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 [02083763-bbaf-bdc3-98b6-000000001967] 7554 1726853187.37652: sending task result for task 02083763-bbaf-bdc3-98b6-000000001967 7554 1726853187.37709: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001967 7554 1726853187.37711: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7554 1726853187.37759: no more pending results, returning what we have 7554 1726853187.37763: results queue empty 7554 1726853187.37764: checking for any_errors_fatal 7554 1726853187.37775: done checking for any_errors_fatal 7554 1726853187.37775: checking for max_fail_percentage 7554 
1726853187.37777: done checking for max_fail_percentage 7554 1726853187.37778: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.37779: done checking to see if all hosts have failed 7554 1726853187.37779: getting the remaining hosts for this loop 7554 1726853187.37781: done getting the remaining hosts for this loop 7554 1726853187.37784: getting the next task for host managed_node3 7554 1726853187.37790: done getting next task for host managed_node3 7554 1726853187.37793: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 7554 1726853187.37796: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853187.37800: getting variables 7554 1726853187.37802: in VariableManager get_vars() 7554 1726853187.37845: Calling all_inventory to load vars for managed_node3 7554 1726853187.37848: Calling groups_inventory to load vars for managed_node3 7554 1726853187.37850: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.37861: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.37863: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.37866: Calling groups_plugins_play to load vars for managed_node3 7554 1726853187.39119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853187.39976: done with get_vars() 7554 1726853187.39994: done getting variables 7554 1726853187.40038: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853187.40119: variable 'profile' from source: include params 7554 1726853187.40123: variable 'interface' from source: play vars 7554 1726853187.40164: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:26:27 -0400 (0:00:00.036) 0:00:41.369 ****** 7554 1726853187.40189: entering _queue_task() for managed_node3/set_fact 7554 1726853187.40438: worker is 1 (out of 1 available) 7554 1726853187.40451: exiting _queue_task() for managed_node3/set_fact 7554 1726853187.40466: done queuing things up, now waiting for results queue to drain 7554 1726853187.40467: waiting for pending results... 
7554 1726853187.40645: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 7554 1726853187.40734: in run() - task 02083763-bbaf-bdc3-98b6-000000001968 7554 1726853187.40748: variable 'ansible_search_path' from source: unknown 7554 1726853187.40753: variable 'ansible_search_path' from source: unknown 7554 1726853187.40781: calling self._execute() 7554 1726853187.40857: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.40861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.40869: variable 'omit' from source: magic vars 7554 1726853187.41137: variable 'ansible_distribution_major_version' from source: facts 7554 1726853187.41150: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853187.41231: variable 'profile_stat' from source: set_fact 7554 1726853187.41246: Evaluated conditional (profile_stat.stat.exists): False 7554 1726853187.41250: when evaluation is False, skipping this task 7554 1726853187.41253: _execute() done 7554 1726853187.41256: dumping result to json 7554 1726853187.41260: done dumping result, returning 7554 1726853187.41266: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [02083763-bbaf-bdc3-98b6-000000001968] 7554 1726853187.41272: sending task result for task 02083763-bbaf-bdc3-98b6-000000001968 7554 1726853187.41349: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001968 7554 1726853187.41351: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7554 1726853187.41398: no more pending results, returning what we have 7554 1726853187.41401: results queue empty 7554 1726853187.41402: checking for any_errors_fatal 7554 1726853187.41408: done checking for any_errors_fatal 7554 1726853187.41408: checking for max_fail_percentage 7554 
1726853187.41410: done checking for max_fail_percentage 7554 1726853187.41411: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.41412: done checking to see if all hosts have failed 7554 1726853187.41412: getting the remaining hosts for this loop 7554 1726853187.41414: done getting the remaining hosts for this loop 7554 1726853187.41417: getting the next task for host managed_node3 7554 1726853187.41425: done getting next task for host managed_node3 7554 1726853187.41429: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 7554 1726853187.41432: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853187.41437: getting variables 7554 1726853187.41438: in VariableManager get_vars() 7554 1726853187.41486: Calling all_inventory to load vars for managed_node3 7554 1726853187.41488: Calling groups_inventory to load vars for managed_node3 7554 1726853187.41490: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.41501: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.41503: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.41506: Calling groups_plugins_play to load vars for managed_node3 7554 1726853187.42370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853187.43221: done with get_vars() 7554 1726853187.43236: done getting variables 7554 1726853187.43279: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853187.43356: variable 'profile' from source: include params 7554 1726853187.43359: variable 'interface' from source: play vars 7554 1726853187.43398: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:26:27 -0400 (0:00:00.032) 0:00:41.401 ****** 7554 1726853187.43423: entering _queue_task() for managed_node3/assert 7554 1726853187.43645: worker is 1 (out of 1 available) 7554 1726853187.43661: exiting _queue_task() for managed_node3/assert 7554 1726853187.43675: done queuing things up, now waiting for results queue to drain 7554 1726853187.43677: waiting for pending results... 
7554 1726853187.43850: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' 7554 1726853187.43924: in run() - task 02083763-bbaf-bdc3-98b6-0000000016d3 7554 1726853187.43935: variable 'ansible_search_path' from source: unknown 7554 1726853187.43940: variable 'ansible_search_path' from source: unknown 7554 1726853187.43969: calling self._execute() 7554 1726853187.44046: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.44053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.44063: variable 'omit' from source: magic vars 7554 1726853187.44326: variable 'ansible_distribution_major_version' from source: facts 7554 1726853187.44341: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853187.44345: variable 'omit' from source: magic vars 7554 1726853187.44374: variable 'omit' from source: magic vars 7554 1726853187.44438: variable 'profile' from source: include params 7554 1726853187.44444: variable 'interface' from source: play vars 7554 1726853187.44493: variable 'interface' from source: play vars 7554 1726853187.44508: variable 'omit' from source: magic vars 7554 1726853187.44540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853187.44572: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853187.44590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853187.44603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.44613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.44638: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 
1726853187.44641: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.44643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.44717: Set connection var ansible_shell_executable to /bin/sh 7554 1726853187.44724: Set connection var ansible_pipelining to False 7554 1726853187.44727: Set connection var ansible_shell_type to sh 7554 1726853187.44730: Set connection var ansible_connection to ssh 7554 1726853187.44737: Set connection var ansible_timeout to 10 7554 1726853187.44741: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853187.44761: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.44764: variable 'ansible_connection' from source: unknown 7554 1726853187.44766: variable 'ansible_module_compression' from source: unknown 7554 1726853187.44769: variable 'ansible_shell_type' from source: unknown 7554 1726853187.44774: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.44777: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.44780: variable 'ansible_pipelining' from source: unknown 7554 1726853187.44783: variable 'ansible_timeout' from source: unknown 7554 1726853187.44785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.44884: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853187.44895: variable 'omit' from source: magic vars 7554 1726853187.44898: starting attempt loop 7554 1726853187.44901: running the handler 7554 1726853187.44978: variable 'lsr_net_profile_exists' from source: set_fact 7554 1726853187.44982: Evaluated conditional (lsr_net_profile_exists): True 7554 1726853187.44988: 
handler run complete 7554 1726853187.44999: attempt loop complete, returning result 7554 1726853187.45002: _execute() done 7554 1726853187.45005: dumping result to json 7554 1726853187.45007: done dumping result, returning 7554 1726853187.45018: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' [02083763-bbaf-bdc3-98b6-0000000016d3] 7554 1726853187.45022: sending task result for task 02083763-bbaf-bdc3-98b6-0000000016d3 7554 1726853187.45099: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000016d3 7554 1726853187.45102: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853187.45178: no more pending results, returning what we have 7554 1726853187.45181: results queue empty 7554 1726853187.45182: checking for any_errors_fatal 7554 1726853187.45187: done checking for any_errors_fatal 7554 1726853187.45188: checking for max_fail_percentage 7554 1726853187.45190: done checking for max_fail_percentage 7554 1726853187.45191: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.45192: done checking to see if all hosts have failed 7554 1726853187.45192: getting the remaining hosts for this loop 7554 1726853187.45194: done getting the remaining hosts for this loop 7554 1726853187.45197: getting the next task for host managed_node3 7554 1726853187.45202: done getting next task for host managed_node3 7554 1726853187.45204: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 7554 1726853187.45206: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853187.45210: getting variables 7554 1726853187.45211: in VariableManager get_vars() 7554 1726853187.45259: Calling all_inventory to load vars for managed_node3 7554 1726853187.45262: Calling groups_inventory to load vars for managed_node3 7554 1726853187.45264: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.45275: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.45277: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.45280: Calling groups_plugins_play to load vars for managed_node3 7554 1726853187.46031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853187.50683: done with get_vars() 7554 1726853187.50701: done getting variables 7554 1726853187.50735: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853187.50807: variable 'profile' from source: include params 7554 1726853187.50809: variable 'interface' from source: play vars 7554 1726853187.50848: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:26:27 -0400 (0:00:00.074) 0:00:41.476 ****** 7554 1726853187.50873: entering _queue_task() for managed_node3/assert 7554 1726853187.51135: worker is 1 (out of 1 available) 7554 1726853187.51154: exiting _queue_task() for 
managed_node3/assert 7554 1726853187.51167: done queuing things up, now waiting for results queue to drain 7554 1726853187.51168: waiting for pending results... 7554 1726853187.51352: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' 7554 1726853187.51426: in run() - task 02083763-bbaf-bdc3-98b6-0000000016d4 7554 1726853187.51439: variable 'ansible_search_path' from source: unknown 7554 1726853187.51444: variable 'ansible_search_path' from source: unknown 7554 1726853187.51472: calling self._execute() 7554 1726853187.51556: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.51562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.51570: variable 'omit' from source: magic vars 7554 1726853187.51848: variable 'ansible_distribution_major_version' from source: facts 7554 1726853187.51859: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853187.51866: variable 'omit' from source: magic vars 7554 1726853187.51893: variable 'omit' from source: magic vars 7554 1726853187.51964: variable 'profile' from source: include params 7554 1726853187.51969: variable 'interface' from source: play vars 7554 1726853187.52014: variable 'interface' from source: play vars 7554 1726853187.52029: variable 'omit' from source: magic vars 7554 1726853187.52064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853187.52094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853187.52112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853187.52125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.52136: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.52164: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853187.52167: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.52170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.52237: Set connection var ansible_shell_executable to /bin/sh 7554 1726853187.52247: Set connection var ansible_pipelining to False 7554 1726853187.52250: Set connection var ansible_shell_type to sh 7554 1726853187.52252: Set connection var ansible_connection to ssh 7554 1726853187.52258: Set connection var ansible_timeout to 10 7554 1726853187.52265: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853187.52286: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.52289: variable 'ansible_connection' from source: unknown 7554 1726853187.52291: variable 'ansible_module_compression' from source: unknown 7554 1726853187.52294: variable 'ansible_shell_type' from source: unknown 7554 1726853187.52296: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.52298: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.52300: variable 'ansible_pipelining' from source: unknown 7554 1726853187.52303: variable 'ansible_timeout' from source: unknown 7554 1726853187.52307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.52407: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853187.52415: variable 'omit' from source: magic vars 7554 1726853187.52420: starting attempt loop 7554 
1726853187.52423: running the handler 7554 1726853187.52499: variable 'lsr_net_profile_ansible_managed' from source: set_fact 7554 1726853187.52503: Evaluated conditional (lsr_net_profile_ansible_managed): True 7554 1726853187.52508: handler run complete 7554 1726853187.52520: attempt loop complete, returning result 7554 1726853187.52523: _execute() done 7554 1726853187.52525: dumping result to json 7554 1726853187.52528: done dumping result, returning 7554 1726853187.52534: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' [02083763-bbaf-bdc3-98b6-0000000016d4] 7554 1726853187.52540: sending task result for task 02083763-bbaf-bdc3-98b6-0000000016d4 7554 1726853187.52624: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000016d4 7554 1726853187.52626: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853187.52675: no more pending results, returning what we have 7554 1726853187.52678: results queue empty 7554 1726853187.52679: checking for any_errors_fatal 7554 1726853187.52686: done checking for any_errors_fatal 7554 1726853187.52686: checking for max_fail_percentage 7554 1726853187.52688: done checking for max_fail_percentage 7554 1726853187.52689: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.52690: done checking to see if all hosts have failed 7554 1726853187.52690: getting the remaining hosts for this loop 7554 1726853187.52692: done getting the remaining hosts for this loop 7554 1726853187.52695: getting the next task for host managed_node3 7554 1726853187.52700: done getting next task for host managed_node3 7554 1726853187.52702: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 7554 1726853187.52705: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853187.52709: getting variables 7554 1726853187.52710: in VariableManager get_vars() 7554 1726853187.52767: Calling all_inventory to load vars for managed_node3 7554 1726853187.52770: Calling groups_inventory to load vars for managed_node3 7554 1726853187.52774: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.52784: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.52786: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.52788: Calling groups_plugins_play to load vars for managed_node3 7554 1726853187.53577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853187.54460: done with get_vars() 7554 1726853187.54478: done getting variables 7554 1726853187.54516: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853187.54595: variable 'profile' from source: include params 7554 1726853187.54598: variable 'interface' from source: play vars 7554 1726853187.54636: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 
2024 13:26:27 -0400 (0:00:00.037) 0:00:41.514 ****** 7554 1726853187.54663: entering _queue_task() for managed_node3/assert 7554 1726853187.54879: worker is 1 (out of 1 available) 7554 1726853187.54892: exiting _queue_task() for managed_node3/assert 7554 1726853187.54904: done queuing things up, now waiting for results queue to drain 7554 1726853187.54906: waiting for pending results... 7554 1726853187.55085: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 7554 1726853187.55162: in run() - task 02083763-bbaf-bdc3-98b6-0000000016d5 7554 1726853187.55207: variable 'ansible_search_path' from source: unknown 7554 1726853187.55210: variable 'ansible_search_path' from source: unknown 7554 1726853187.55277: calling self._execute() 7554 1726853187.55448: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.55451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.55454: variable 'omit' from source: magic vars 7554 1726853187.55792: variable 'ansible_distribution_major_version' from source: facts 7554 1726853187.55795: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853187.55799: variable 'omit' from source: magic vars 7554 1726853187.55839: variable 'omit' from source: magic vars 7554 1726853187.55956: variable 'profile' from source: include params 7554 1726853187.56011: variable 'interface' from source: play vars 7554 1726853187.56047: variable 'interface' from source: play vars 7554 1726853187.56075: variable 'omit' from source: magic vars 7554 1726853187.56127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853187.56173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853187.56206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853187.56220: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.56237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.56262: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853187.56265: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.56268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.56346: Set connection var ansible_shell_executable to /bin/sh 7554 1726853187.56351: Set connection var ansible_pipelining to False 7554 1726853187.56354: Set connection var ansible_shell_type to sh 7554 1726853187.56356: Set connection var ansible_connection to ssh 7554 1726853187.56364: Set connection var ansible_timeout to 10 7554 1726853187.56369: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853187.56388: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.56391: variable 'ansible_connection' from source: unknown 7554 1726853187.56394: variable 'ansible_module_compression' from source: unknown 7554 1726853187.56396: variable 'ansible_shell_type' from source: unknown 7554 1726853187.56399: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.56401: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.56405: variable 'ansible_pipelining' from source: unknown 7554 1726853187.56408: variable 'ansible_timeout' from source: unknown 7554 1726853187.56411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.56509: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853187.56519: variable 'omit' from source: magic vars 7554 1726853187.56524: starting attempt loop 7554 1726853187.56526: running the handler 7554 1726853187.56603: variable 'lsr_net_profile_fingerprint' from source: set_fact 7554 1726853187.56606: Evaluated conditional (lsr_net_profile_fingerprint): True 7554 1726853187.56612: handler run complete 7554 1726853187.56624: attempt loop complete, returning result 7554 1726853187.56626: _execute() done 7554 1726853187.56629: dumping result to json 7554 1726853187.56632: done dumping result, returning 7554 1726853187.56637: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 [02083763-bbaf-bdc3-98b6-0000000016d5] 7554 1726853187.56645: sending task result for task 02083763-bbaf-bdc3-98b6-0000000016d5 7554 1726853187.56724: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000016d5 7554 1726853187.56727: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853187.56809: no more pending results, returning what we have 7554 1726853187.56812: results queue empty 7554 1726853187.56813: checking for any_errors_fatal 7554 1726853187.56817: done checking for any_errors_fatal 7554 1726853187.56818: checking for max_fail_percentage 7554 1726853187.56819: done checking for max_fail_percentage 7554 1726853187.56820: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.56821: done checking to see if all hosts have failed 7554 1726853187.56822: getting the remaining hosts for this loop 7554 1726853187.56823: done getting the remaining hosts for this loop 7554 1726853187.56826: getting the next task for host managed_node3 7554 1726853187.56832: done getting next task for host managed_node3 7554 
1726853187.56835: ^ task is: TASK: Show ipv4 routes 7554 1726853187.56836: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853187.56840: getting variables 7554 1726853187.56841: in VariableManager get_vars() 7554 1726853187.56885: Calling all_inventory to load vars for managed_node3 7554 1726853187.56887: Calling groups_inventory to load vars for managed_node3 7554 1726853187.56890: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.56898: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.56900: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.56902: Calling groups_plugins_play to load vars for managed_node3 7554 1726853187.57797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853187.59103: done with get_vars() 7554 1726853187.59124: done getting variables 7554 1726853187.59184: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv4 routes] ******************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:114 Friday 20 September 2024 13:26:27 -0400 (0:00:00.045) 0:00:41.559 ****** 7554 1726853187.59212: entering _queue_task() for managed_node3/command 7554 1726853187.59511: worker is 1 (out of 1 available) 7554 1726853187.59524: exiting _queue_task() for managed_node3/command 7554 
1726853187.59536: done queuing things up, now waiting for results queue to drain 7554 1726853187.59537: waiting for pending results... 7554 1726853187.59992: running TaskExecutor() for managed_node3/TASK: Show ipv4 routes 7554 1726853187.59998: in run() - task 02083763-bbaf-bdc3-98b6-0000000000ff 7554 1726853187.60002: variable 'ansible_search_path' from source: unknown 7554 1726853187.60005: calling self._execute() 7554 1726853187.60117: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.60136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.60153: variable 'omit' from source: magic vars 7554 1726853187.60556: variable 'ansible_distribution_major_version' from source: facts 7554 1726853187.60579: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853187.60590: variable 'omit' from source: magic vars 7554 1726853187.60613: variable 'omit' from source: magic vars 7554 1726853187.60658: variable 'omit' from source: magic vars 7554 1726853187.60710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853187.60755: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853187.60787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853187.60809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.60876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.60879: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853187.60886: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.60888: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853187.60979: Set connection var ansible_shell_executable to /bin/sh 7554 1726853187.60998: Set connection var ansible_pipelining to False 7554 1726853187.61004: Set connection var ansible_shell_type to sh 7554 1726853187.61010: Set connection var ansible_connection to ssh 7554 1726853187.61024: Set connection var ansible_timeout to 10 7554 1726853187.61033: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853187.61061: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.61068: variable 'ansible_connection' from source: unknown 7554 1726853187.61077: variable 'ansible_module_compression' from source: unknown 7554 1726853187.61102: variable 'ansible_shell_type' from source: unknown 7554 1726853187.61106: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.61109: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.61111: variable 'ansible_pipelining' from source: unknown 7554 1726853187.61113: variable 'ansible_timeout' from source: unknown 7554 1726853187.61115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.61276: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853187.61279: variable 'omit' from source: magic vars 7554 1726853187.61282: starting attempt loop 7554 1726853187.61284: running the handler 7554 1726853187.61302: _low_level_execute_command(): starting 7554 1726853187.61313: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853187.62009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853187.62013: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853187.62017: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853187.62118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853187.62180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853187.63889: stdout chunk (state=3): >>>/root <<< 7554 1726853187.63998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853187.64049: stderr chunk (state=3): >>><<< 7554 1726853187.64052: stdout chunk (state=3): >>><<< 7554 1726853187.64075: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853187.64182: _low_level_execute_command(): starting 7554 1726853187.64188: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157 `" && echo ansible-tmp-1726853187.6408412-9113-56394760544157="` echo /root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157 `" ) && sleep 0' 7554 1726853187.64786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853187.64907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853187.64940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853187.65041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853187.67026: stdout chunk (state=3): >>>ansible-tmp-1726853187.6408412-9113-56394760544157=/root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157 <<< 7554 1726853187.67200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853187.67204: stdout chunk (state=3): >>><<< 7554 1726853187.67206: stderr chunk (state=3): >>><<< 7554 1726853187.67376: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853187.6408412-9113-56394760544157=/root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853187.67380: variable 'ansible_module_compression' from source: unknown 7554 1726853187.67383: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853187.67385: variable 'ansible_facts' from source: unknown 7554 1726853187.67457: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157/AnsiballZ_command.py 7554 1726853187.67693: Sending initial data 7554 1726853187.67696: Sent initial data (153 bytes) 7554 1726853187.68210: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853187.68221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853187.68232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853187.68248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853187.68264: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853187.68272: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853187.68278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853187.68294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853187.68308: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853187.68318: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853187.68376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853187.68464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853187.68522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853187.70188: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853187.70243: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853187.70296: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpkhgedv7r /root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157/AnsiballZ_command.py <<< 7554 1726853187.70305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157/AnsiballZ_command.py" <<< 7554 1726853187.70353: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpkhgedv7r" to remote "/root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157/AnsiballZ_command.py" <<< 7554 1726853187.71076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853187.71180: stderr chunk (state=3): >>><<< 7554 1726853187.71183: stdout chunk (state=3): >>><<< 7554 1726853187.71185: done transferring module to remote 7554 1726853187.71189: _low_level_execute_command(): starting 7554 1726853187.71191: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157/ /root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157/AnsiballZ_command.py && sleep 0' 7554 1726853187.71709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853187.71721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853187.71734: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853187.71784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853187.71797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853187.71866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853187.73783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853187.73788: stdout chunk (state=3): >>><<< 7554 1726853187.73790: stderr chunk (state=3): >>><<< 7554 1726853187.73888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853187.73892: _low_level_execute_command(): starting 7554 1726853187.73895: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157/AnsiballZ_command.py && sleep 0' 7554 1726853187.74493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853187.74545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853187.74552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853187.74578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853187.74662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 
1726853187.90704: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 13:26:27.901737", "end": "2024-09-20 13:26:27.905603", "delta": "0:00:00.003866", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853187.92393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853187.92418: stderr chunk (state=3): >>><<< 7554 1726853187.92421: stdout chunk (state=3): >>><<< 7554 1726853187.92439: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 13:26:27.901737", "end": "2024-09-20 13:26:27.905603", "delta": "0:00:00.003866", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853187.92473: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853187.92482: _low_level_execute_command(): starting 7554 1726853187.92486: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853187.6408412-9113-56394760544157/ > /dev/null 2>&1 && sleep 0' 7554 
1726853187.92941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853187.92947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853187.92950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853187.92952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853187.92954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853187.92995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853187.93016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853187.93072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853187.94957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853187.94982: stderr chunk (state=3): >>><<< 7554 1726853187.94985: stdout chunk (state=3): >>><<< 7554 1726853187.94998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853187.95005: handler run complete 7554 1726853187.95025: Evaluated conditional (False): False 7554 1726853187.95033: attempt loop complete, returning result 7554 1726853187.95036: _execute() done 7554 1726853187.95038: dumping result to json 7554 1726853187.95045: done dumping result, returning 7554 1726853187.95053: done running TaskExecutor() for managed_node3/TASK: Show ipv4 routes [02083763-bbaf-bdc3-98b6-0000000000ff] 7554 1726853187.95059: sending task result for task 02083763-bbaf-bdc3-98b6-0000000000ff 7554 1726853187.95159: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000000ff 7554 1726853187.95162: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "route" ], "delta": "0:00:00.003866", "end": "2024-09-20 13:26:27.905603", "rc": 0, "start": "2024-09-20 13:26:27.901737" } STDOUT: default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 
10.31.11.217 metric 100 203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 7554 1726853187.95231: no more pending results, returning what we have 7554 1726853187.95235: results queue empty 7554 1726853187.95235: checking for any_errors_fatal 7554 1726853187.95244: done checking for any_errors_fatal 7554 1726853187.95245: checking for max_fail_percentage 7554 1726853187.95247: done checking for max_fail_percentage 7554 1726853187.95248: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.95249: done checking to see if all hosts have failed 7554 1726853187.95249: getting the remaining hosts for this loop 7554 1726853187.95250: done getting the remaining hosts for this loop 7554 1726853187.95254: getting the next task for host managed_node3 7554 1726853187.95259: done getting next task for host managed_node3 7554 1726853187.95262: ^ task is: TASK: Assert default ipv4 route is absent 7554 1726853187.95264: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853187.95267: getting variables 7554 1726853187.95269: in VariableManager get_vars() 7554 1726853187.95336: Calling all_inventory to load vars for managed_node3 7554 1726853187.95338: Calling groups_inventory to load vars for managed_node3 7554 1726853187.95340: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.95353: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.95356: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.95358: Calling groups_plugins_play to load vars for managed_node3 7554 1726853187.96167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853187.97136: done with get_vars() 7554 1726853187.97159: done getting variables 7554 1726853187.97216: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv4 route is absent] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:118 Friday 20 September 2024 13:26:27 -0400 (0:00:00.380) 0:00:41.939 ****** 7554 1726853187.97248: entering _queue_task() for managed_node3/assert 7554 1726853187.97533: worker is 1 (out of 1 available) 7554 1726853187.97549: exiting _queue_task() for managed_node3/assert 7554 1726853187.97562: done queuing things up, now waiting for results queue to drain 7554 1726853187.97563: waiting for pending results... 
7554 1726853187.97990: running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is absent 7554 1726853187.98001: in run() - task 02083763-bbaf-bdc3-98b6-000000000100 7554 1726853187.98005: variable 'ansible_search_path' from source: unknown 7554 1726853187.98037: calling self._execute() 7554 1726853187.98135: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.98146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.98152: variable 'omit' from source: magic vars 7554 1726853187.98436: variable 'ansible_distribution_major_version' from source: facts 7554 1726853187.98450: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853187.98456: variable 'omit' from source: magic vars 7554 1726853187.98475: variable 'omit' from source: magic vars 7554 1726853187.98503: variable 'omit' from source: magic vars 7554 1726853187.98537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853187.98567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853187.98587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853187.98600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.98610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853187.98635: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853187.98638: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.98641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.98714: Set connection var ansible_shell_executable to /bin/sh 7554 1726853187.98721: Set 
connection var ansible_pipelining to False 7554 1726853187.98724: Set connection var ansible_shell_type to sh 7554 1726853187.98726: Set connection var ansible_connection to ssh 7554 1726853187.98736: Set connection var ansible_timeout to 10 7554 1726853187.98738: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853187.98759: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.98762: variable 'ansible_connection' from source: unknown 7554 1726853187.98765: variable 'ansible_module_compression' from source: unknown 7554 1726853187.98768: variable 'ansible_shell_type' from source: unknown 7554 1726853187.98770: variable 'ansible_shell_executable' from source: unknown 7554 1726853187.98774: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853187.98776: variable 'ansible_pipelining' from source: unknown 7554 1726853187.98778: variable 'ansible_timeout' from source: unknown 7554 1726853187.98782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853187.98886: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853187.98896: variable 'omit' from source: magic vars 7554 1726853187.98900: starting attempt loop 7554 1726853187.98903: running the handler 7554 1726853187.99007: variable '__test_str' from source: task vars 7554 1726853187.99073: variable 'interface' from source: play vars 7554 1726853187.99082: variable 'ipv4_routes' from source: set_fact 7554 1726853187.99094: Evaluated conditional (__test_str not in ipv4_routes.stdout): True 7554 1726853187.99099: handler run complete 7554 1726853187.99109: attempt loop complete, returning result 7554 1726853187.99112: _execute() done 7554 1726853187.99114: dumping 
result to json 7554 1726853187.99116: done dumping result, returning 7554 1726853187.99128: done running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is absent [02083763-bbaf-bdc3-98b6-000000000100] 7554 1726853187.99131: sending task result for task 02083763-bbaf-bdc3-98b6-000000000100 7554 1726853187.99213: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000100 7554 1726853187.99216: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7554 1726853187.99260: no more pending results, returning what we have 7554 1726853187.99264: results queue empty 7554 1726853187.99264: checking for any_errors_fatal 7554 1726853187.99275: done checking for any_errors_fatal 7554 1726853187.99276: checking for max_fail_percentage 7554 1726853187.99277: done checking for max_fail_percentage 7554 1726853187.99278: checking to see if all hosts have failed and the running result is not ok 7554 1726853187.99279: done checking to see if all hosts have failed 7554 1726853187.99280: getting the remaining hosts for this loop 7554 1726853187.99281: done getting the remaining hosts for this loop 7554 1726853187.99284: getting the next task for host managed_node3 7554 1726853187.99289: done getting next task for host managed_node3 7554 1726853187.99292: ^ task is: TASK: Get ipv6 routes 7554 1726853187.99294: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853187.99297: getting variables 7554 1726853187.99299: in VariableManager get_vars() 7554 1726853187.99340: Calling all_inventory to load vars for managed_node3 7554 1726853187.99343: Calling groups_inventory to load vars for managed_node3 7554 1726853187.99345: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853187.99354: Calling all_plugins_play to load vars for managed_node3 7554 1726853187.99357: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853187.99359: Calling groups_plugins_play to load vars for managed_node3 7554 1726853188.00618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853188.02143: done with get_vars() 7554 1726853188.02173: done getting variables 7554 1726853188.02233: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:123 Friday 20 September 2024 13:26:28 -0400 (0:00:00.050) 0:00:41.990 ****** 7554 1726853188.02261: entering _queue_task() for managed_node3/command 7554 1726853188.02600: worker is 1 (out of 1 available) 7554 1726853188.02612: exiting _queue_task() for managed_node3/command 7554 1726853188.02625: done queuing things up, now waiting for results queue to drain 7554 1726853188.02626: waiting for pending results... 
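The ipv4 assert above succeeded because the evaluated conditional (`__test_str not in ipv4_routes.stdout`) came out True. A minimal Python sketch of that substring-membership check follows; the route table and test string here are hypothetical stand-ins, since the log does not show the actual values of `ipv4_routes.stdout` or `__test_str`:

```python
# Sketch of the conditional the log shows Ansible evaluating:
#   (__test_str not in ipv4_routes.stdout)
# Both values below are hypothetical; in the real play they come from an
# earlier "ip route" task (registered as ipv4_routes) and a task var.
ipv4_routes_stdout = (
    "192.0.2.0/24 dev veth0 proto kernel scope link src 192.0.2.72\n"
    "198.51.100.0/24 dev eth0 proto kernel scope link src 198.51.100.11\n"
)

# Hypothetical stand-in for the playbook's __test_str task var.
test_str = "default via 192.0.2.1"

# The assert action reports "All assertions passed" when the conditional
# holds, i.e. when the default-route string is absent from the table.
assert test_str not in ipv4_routes_stdout, "default ipv4 route unexpectedly present"
print("All assertions passed")
```

The real task applies the same check via Jinja2 on the managed node's registered command output rather than on literals.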
7554 1726853188.02999: running TaskExecutor() for managed_node3/TASK: Get ipv6 routes 7554 1726853188.03019: in run() - task 02083763-bbaf-bdc3-98b6-000000000101 7554 1726853188.03041: variable 'ansible_search_path' from source: unknown 7554 1726853188.03084: calling self._execute() 7554 1726853188.03201: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.03205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.03277: variable 'omit' from source: magic vars 7554 1726853188.03605: variable 'ansible_distribution_major_version' from source: facts 7554 1726853188.03623: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853188.03638: variable 'omit' from source: magic vars 7554 1726853188.03660: variable 'omit' from source: magic vars 7554 1726853188.03702: variable 'omit' from source: magic vars 7554 1726853188.03750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853188.03795: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853188.03821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853188.03848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853188.04076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853188.04078: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853188.04080: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.04082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.04083: Set connection var ansible_shell_executable to /bin/sh 7554 1726853188.04085: Set connection var 
ansible_pipelining to False 7554 1726853188.04087: Set connection var ansible_shell_type to sh 7554 1726853188.04089: Set connection var ansible_connection to ssh 7554 1726853188.04090: Set connection var ansible_timeout to 10 7554 1726853188.04092: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853188.04093: variable 'ansible_shell_executable' from source: unknown 7554 1726853188.04095: variable 'ansible_connection' from source: unknown 7554 1726853188.04097: variable 'ansible_module_compression' from source: unknown 7554 1726853188.04099: variable 'ansible_shell_type' from source: unknown 7554 1726853188.04101: variable 'ansible_shell_executable' from source: unknown 7554 1726853188.04102: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.04103: variable 'ansible_pipelining' from source: unknown 7554 1726853188.04108: variable 'ansible_timeout' from source: unknown 7554 1726853188.04115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.04246: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853188.04262: variable 'omit' from source: magic vars 7554 1726853188.04273: starting attempt loop 7554 1726853188.04279: running the handler 7554 1726853188.04299: _low_level_execute_command(): starting 7554 1726853188.04313: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853188.05100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853188.05187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853188.05235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853188.05255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853188.05281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853188.05392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853188.07110: stdout chunk (state=3): >>>/root <<< 7554 1726853188.07251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853188.07269: stdout chunk (state=3): >>><<< 7554 1726853188.07307: stderr chunk (state=3): >>><<< 7554 1726853188.07336: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853188.07356: _low_level_execute_command(): starting 7554 1726853188.07384: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583 `" && echo ansible-tmp-1726853188.07343-9130-158873451686583="` echo /root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583 `" ) && sleep 0' 7554 1726853188.08096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853188.08111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853188.08133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853188.08181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853188.08250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853188.08647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853188.08650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853188.08652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853188.08722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853188.10757: stdout chunk (state=3): >>>ansible-tmp-1726853188.07343-9130-158873451686583=/root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583 <<< 7554 1726853188.10907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853188.10918: stderr chunk (state=3): >>><<< 7554 1726853188.10935: stdout chunk (state=3): >>><<< 7554 1726853188.11076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853188.07343-9130-158873451686583=/root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853188.11080: variable 'ansible_module_compression' from source: unknown 7554 1726853188.11085: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853188.11103: variable 'ansible_facts' from source: unknown 7554 1726853188.11191: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583/AnsiballZ_command.py 7554 1726853188.11442: Sending initial data 7554 1726853188.11445: Sent initial data (152 bytes) 7554 1726853188.11999: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853188.12015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853188.12032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853188.12052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853188.12097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853188.12110: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853188.12187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853188.12210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853188.12228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853188.12319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853188.14029: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853188.14107: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853188.14166: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpcwh544su /root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583/AnsiballZ_command.py <<< 7554 1726853188.14170: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583/AnsiballZ_command.py" <<< 7554 1726853188.14249: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpcwh544su" to remote "/root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583/AnsiballZ_command.py" <<< 7554 1726853188.15235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853188.15238: stdout chunk (state=3): >>><<< 7554 1726853188.15249: stderr chunk (state=3): >>><<< 7554 1726853188.15346: done transferring module to remote 7554 1726853188.15350: _low_level_execute_command(): starting 7554 1726853188.15353: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583/ /root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583/AnsiballZ_command.py && sleep 0' 7554 1726853188.16088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853188.16136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853188.16153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853188.16169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853188.16260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853188.18406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853188.18521: stderr chunk (state=3): >>><<< 7554 1726853188.18525: stdout chunk (state=3): >>><<< 7554 1726853188.18528: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853188.18531: _low_level_execute_command(): starting 7554 1726853188.18534: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583/AnsiballZ_command.py && sleep 0' 7554 1726853188.19534: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853188.19550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853188.19787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853188.19992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853188.20095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853188.36430: stdout chunk (state=3): >>> {"changed": true, "stdout": 
"2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 13:26:28.358993", "end": "2024-09-20 13:26:28.362851", "delta": "0:00:00.003858", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853188.38117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853188.38129: stdout chunk (state=3): >>><<< 7554 1726853188.38143: stderr chunk (state=3): >>><<< 7554 1726853188.38169: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 13:26:28.358993", "end": "2024-09-20 13:26:28.362851", "delta": "0:00:00.003858", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853188.38214: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853188.38458: _low_level_execute_command(): starting 7554 1726853188.38462: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853188.07343-9130-158873451686583/ > /dev/null 2>&1 && sleep 0' 7554 1726853188.39388: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853188.39488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853188.39503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853188.39686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853188.39719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853188.39734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853188.39755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853188.40059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853188.42034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853188.42038: stdout chunk (state=3): >>><<< 7554 1726853188.42040: stderr chunk (state=3): >>><<< 7554 1726853188.42175: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.217 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7554 1726853188.42179: handler run complete
7554 1726853188.42181: Evaluated conditional (False): False
7554 1726853188.42184: attempt loop complete, returning result
7554 1726853188.42186: _execute() done
7554 1726853188.42188: dumping result to json
7554 1726853188.42189: done dumping result, returning
7554 1726853188.42191: done running TaskExecutor() for managed_node3/TASK: Get ipv6 routes [02083763-bbaf-bdc3-98b6-000000000101]
7554 1726853188.42193: sending task result for task 02083763-bbaf-bdc3-98b6-000000000101
7554 1726853188.42273: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000101
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ip",
        "-6",
        "route"
    ],
    "delta": "0:00:00.003858",
    "end": "2024-09-20 13:26:28.362851",
    "rc": 0,
    "start": "2024-09-20 13:26:28.358993"
}

STDOUT:

2001:db8::/64 dev veth0 proto kernel metric 101 pref medium
fe80::/64 dev peerveth0 proto kernel metric 256 pref medium
fe80::/64 dev eth0 proto kernel metric 1024 pref medium
fe80::/64 dev veth0 proto kernel metric 1024 pref medium
7554 1726853188.42353: no more pending results, returning what we have
7554 1726853188.42357: results queue empty
7554 1726853188.42358: checking for any_errors_fatal
7554 1726853188.42365: done checking for any_errors_fatal
7554 1726853188.42366: checking for max_fail_percentage
7554 1726853188.42367: done checking for max_fail_percentage
7554 1726853188.42368: checking to see if all hosts have failed and the running result is not ok
7554 1726853188.42370: done checking to see if all hosts have failed
7554 1726853188.42372: getting the remaining hosts for this loop
7554 1726853188.42374: done getting the remaining hosts for this loop
7554 1726853188.42378: getting the next task for host managed_node3
7554 1726853188.42383: done getting next task for host managed_node3
7554 1726853188.42386: ^ task is: TASK: Assert default ipv6 route is absent
7554 1726853188.42388: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853188.42392: getting variables
7554 1726853188.42393: in VariableManager get_vars()
7554 1726853188.42440: Calling all_inventory to load vars for managed_node3
7554 1726853188.42442: Calling groups_inventory to load vars for managed_node3
7554 1726853188.42444: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853188.42457: Calling all_plugins_play to load vars for managed_node3
7554 1726853188.42460: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853188.42464: Calling groups_plugins_play to load vars for managed_node3
7554 1726853188.43104: WORKER PROCESS EXITING
7554 1726853188.44711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853188.46357: done with get_vars()
7554 1726853188.46387: done getting variables
7554 1726853188.46447: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Assert default ipv6 route is absent] *************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:127
Friday 20 September 2024 13:26:28 -0400 (0:00:00.442) 0:00:42.432 ******
7554 1726853188.46482: entering _queue_task() for managed_node3/assert
7554 1726853188.46844: worker is 1 (out of 1 available)
7554 1726853188.46858: exiting _queue_task() for managed_node3/assert
7554 1726853188.46978: done queuing things up, now waiting for results queue to drain
7554 1726853188.46981: waiting for pending results...
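(The route check and the assertion queued above can be sketched roughly as the following task pair. This is a hypothetical reconstruction from the trace, not the verbatim contents of tests_auto_gateway.yml; the real value of `__test_str` is not visible in this log.)

```yaml
# Hypothetical reconstruction of the two tasks seen in the trace.
- name: Get ipv6 routes
  command: ip -6 route
  register: ipv6_route      # the assert below reads ipv6_route.stdout
  changed_when: false       # assumed; the result above reports "changed": false

- name: Assert default ipv6 route is absent
  assert:
    that:
      - __test_str not in ipv6_route.stdout   # the exact conditional evaluated in the trace
  when: network_provider == "nm"              # conditional the trace evaluates to True
```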
7554 1726853188.47183: running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is absent 7554 1726853188.47377: in run() - task 02083763-bbaf-bdc3-98b6-000000000102 7554 1726853188.47381: variable 'ansible_search_path' from source: unknown 7554 1726853188.47385: calling self._execute() 7554 1726853188.47438: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.47441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.47452: variable 'omit' from source: magic vars 7554 1726853188.47848: variable 'ansible_distribution_major_version' from source: facts 7554 1726853188.47866: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853188.47984: variable 'network_provider' from source: set_fact 7554 1726853188.47989: Evaluated conditional (network_provider == "nm"): True 7554 1726853188.48277: variable 'omit' from source: magic vars 7554 1726853188.48280: variable 'omit' from source: magic vars 7554 1726853188.48286: variable 'omit' from source: magic vars 7554 1726853188.48289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853188.48292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853188.48294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853188.48296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853188.48298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853188.48300: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853188.48302: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.48304: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.48336: Set connection var ansible_shell_executable to /bin/sh 7554 1726853188.48347: Set connection var ansible_pipelining to False 7554 1726853188.48350: Set connection var ansible_shell_type to sh 7554 1726853188.48352: Set connection var ansible_connection to ssh 7554 1726853188.48360: Set connection var ansible_timeout to 10 7554 1726853188.48365: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853188.48388: variable 'ansible_shell_executable' from source: unknown 7554 1726853188.48392: variable 'ansible_connection' from source: unknown 7554 1726853188.48401: variable 'ansible_module_compression' from source: unknown 7554 1726853188.48404: variable 'ansible_shell_type' from source: unknown 7554 1726853188.48406: variable 'ansible_shell_executable' from source: unknown 7554 1726853188.48408: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.48414: variable 'ansible_pipelining' from source: unknown 7554 1726853188.48416: variable 'ansible_timeout' from source: unknown 7554 1726853188.48420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.48557: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853188.48776: variable 'omit' from source: magic vars 7554 1726853188.48780: starting attempt loop 7554 1726853188.48782: running the handler 7554 1726853188.48784: variable '__test_str' from source: task vars 7554 1726853188.48793: variable 'interface' from source: play vars 7554 1726853188.48803: variable 'ipv6_route' from source: set_fact 7554 1726853188.48814: Evaluated conditional (__test_str not in ipv6_route.stdout): True 7554 
1726853188.48820: handler run complete
7554 1726853188.48840: attempt loop complete, returning result
7554 1726853188.48846: _execute() done
7554 1726853188.48849: dumping result to json
7554 1726853188.48851: done dumping result, returning
7554 1726853188.48854: done running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is absent [02083763-bbaf-bdc3-98b6-000000000102]
7554 1726853188.48860: sending task result for task 02083763-bbaf-bdc3-98b6-000000000102
7554 1726853188.48952: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000102
7554 1726853188.48955: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
7554 1726853188.49005: no more pending results, returning what we have
7554 1726853188.49008: results queue empty
7554 1726853188.49009: checking for any_errors_fatal
7554 1726853188.49018: done checking for any_errors_fatal
7554 1726853188.49020: checking for max_fail_percentage
7554 1726853188.49022: done checking for max_fail_percentage
7554 1726853188.49023: checking to see if all hosts have failed and the running result is not ok
7554 1726853188.49025: done checking to see if all hosts have failed
7554 1726853188.49025: getting the remaining hosts for this loop
7554 1726853188.49027: done getting the remaining hosts for this loop
7554 1726853188.49030: getting the next task for host managed_node3
7554 1726853188.49035: done getting next task for host managed_node3
7554 1726853188.49038: ^ task is: TASK: TEARDOWN: remove profiles.
7554 1726853188.49040: ^ state is: HOST STATE: block=2, task=36, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853188.49159: getting variables
7554 1726853188.49162: in VariableManager get_vars()
7554 1726853188.49216: Calling all_inventory to load vars for managed_node3
7554 1726853188.49219: Calling groups_inventory to load vars for managed_node3
7554 1726853188.49222: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853188.49231: Calling all_plugins_play to load vars for managed_node3
7554 1726853188.49234: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853188.49237: Calling groups_plugins_play to load vars for managed_node3
7554 1726853188.50741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853188.53716: done with get_vars()
7554 1726853188.53745: done getting variables
7554 1726853188.53984: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [TEARDOWN: remove profiles.] **********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:133
Friday 20 September 2024 13:26:28 -0400 (0:00:00.075) 0:00:42.507 ******
7554 1726853188.54015: entering _queue_task() for managed_node3/debug
7554 1726853188.54774: worker is 1 (out of 1 available)
7554 1726853188.54788: exiting _queue_task() for managed_node3/debug
7554 1726853188.54800: done queuing things up, now waiting for results queue to drain
7554 1726853188.54802: waiting for pending results...
7554 1726853188.55194: running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles.
7554 1726853188.55331: in run() - task 02083763-bbaf-bdc3-98b6-000000000103 7554 1726853188.55384: variable 'ansible_search_path' from source: unknown 7554 1726853188.55529: calling self._execute() 7554 1726853188.55540: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.55553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.55565: variable 'omit' from source: magic vars 7554 1726853188.55938: variable 'ansible_distribution_major_version' from source: facts 7554 1726853188.55961: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853188.55973: variable 'omit' from source: magic vars 7554 1726853188.55994: variable 'omit' from source: magic vars 7554 1726853188.56028: variable 'omit' from source: magic vars 7554 1726853188.56080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853188.56124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853188.56154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853188.56184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853188.56201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853188.56237: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853188.56247: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.56255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.56365: Set connection var ansible_shell_executable to /bin/sh 7554 1726853188.56382: Set connection var ansible_pipelining to False 7554 1726853188.56395: Set connection var ansible_shell_type to 
sh 7554 1726853188.56402: Set connection var ansible_connection to ssh 7554 1726853188.56504: Set connection var ansible_timeout to 10 7554 1726853188.56507: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853188.56510: variable 'ansible_shell_executable' from source: unknown 7554 1726853188.56512: variable 'ansible_connection' from source: unknown 7554 1726853188.56514: variable 'ansible_module_compression' from source: unknown 7554 1726853188.56517: variable 'ansible_shell_type' from source: unknown 7554 1726853188.56519: variable 'ansible_shell_executable' from source: unknown 7554 1726853188.56521: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.56523: variable 'ansible_pipelining' from source: unknown 7554 1726853188.56525: variable 'ansible_timeout' from source: unknown 7554 1726853188.56526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.56649: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853188.56667: variable 'omit' from source: magic vars 7554 1726853188.56680: starting attempt loop 7554 1726853188.56688: running the handler 7554 1726853188.56748: handler run complete 7554 1726853188.56773: attempt loop complete, returning result 7554 1726853188.56783: _execute() done 7554 1726853188.56791: dumping result to json 7554 1726853188.56797: done dumping result, returning 7554 1726853188.56808: done running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 
[02083763-bbaf-bdc3-98b6-000000000103] 7554 1726853188.56820: sending task result for task 02083763-bbaf-bdc3-98b6-000000000103 7554 1726853188.57046: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000103 7554 1726853188.57051: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 7554 1726853188.57107: no more pending results, returning what we have 7554 1726853188.57110: results queue empty 7554 1726853188.57111: checking for any_errors_fatal 7554 1726853188.57118: done checking for any_errors_fatal 7554 1726853188.57119: checking for max_fail_percentage 7554 1726853188.57121: done checking for max_fail_percentage 7554 1726853188.57122: checking to see if all hosts have failed and the running result is not ok 7554 1726853188.57123: done checking to see if all hosts have failed 7554 1726853188.57123: getting the remaining hosts for this loop 7554 1726853188.57124: done getting the remaining hosts for this loop 7554 1726853188.57128: getting the next task for host managed_node3 7554 1726853188.57135: done getting next task for host managed_node3 7554 1726853188.57140: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7554 1726853188.57143: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853188.57161: getting variables 7554 1726853188.57162: in VariableManager get_vars() 7554 1726853188.57204: Calling all_inventory to load vars for managed_node3 7554 1726853188.57207: Calling groups_inventory to load vars for managed_node3 7554 1726853188.57209: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853188.57333: Calling all_plugins_play to load vars for managed_node3 7554 1726853188.57337: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853188.57340: Calling groups_plugins_play to load vars for managed_node3 7554 1726853188.59114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853188.60833: done with get_vars() 7554 1726853188.60852: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:26:28 -0400 (0:00:00.069) 0:00:42.577 ****** 7554 1726853188.60951: entering _queue_task() for managed_node3/include_tasks 7554 1726853188.61544: worker is 1 (out of 1 available) 7554 1726853188.61558: exiting _queue_task() for managed_node3/include_tasks 7554 1726853188.61574: done queuing things up, now waiting for results queue to drain 7554 1726853188.61576: waiting for pending results... 
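(The include queued above, at roles/network/tasks/main.yml:4, plausibly has the following shape. This is a hedged sketch inferred only from the file names and the conditional visible in this trace; the actual task in the role may differ.)

```yaml
# Hypothetical sketch of the include seen in the trace.
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml   # the trace shows set_facts.yml being loaded and its blocks extended
  when: ansible_distribution_major_version != '6'   # conditional the trace evaluates to True
```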
7554 1726853188.61988: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7554 1726853188.62531: in run() - task 02083763-bbaf-bdc3-98b6-00000000010b 7554 1726853188.62536: variable 'ansible_search_path' from source: unknown 7554 1726853188.62540: variable 'ansible_search_path' from source: unknown 7554 1726853188.62545: calling self._execute() 7554 1726853188.62787: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.62791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.62793: variable 'omit' from source: magic vars 7554 1726853188.63213: variable 'ansible_distribution_major_version' from source: facts 7554 1726853188.63235: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853188.63248: _execute() done 7554 1726853188.63255: dumping result to json 7554 1726853188.63330: done dumping result, returning 7554 1726853188.63334: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-bdc3-98b6-00000000010b] 7554 1726853188.63336: sending task result for task 02083763-bbaf-bdc3-98b6-00000000010b 7554 1726853188.63407: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000010b 7554 1726853188.63409: WORKER PROCESS EXITING 7554 1726853188.63468: no more pending results, returning what we have 7554 1726853188.63474: in VariableManager get_vars() 7554 1726853188.63525: Calling all_inventory to load vars for managed_node3 7554 1726853188.63528: Calling groups_inventory to load vars for managed_node3 7554 1726853188.63530: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853188.63541: Calling all_plugins_play to load vars for managed_node3 7554 1726853188.63544: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853188.63547: Calling groups_plugins_play to load vars for 
managed_node3 7554 1726853188.64827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853188.66945: done with get_vars() 7554 1726853188.66969: variable 'ansible_search_path' from source: unknown 7554 1726853188.66973: variable 'ansible_search_path' from source: unknown 7554 1726853188.67013: we have included files to process 7554 1726853188.67014: generating all_blocks data 7554 1726853188.67016: done generating all_blocks data 7554 1726853188.67023: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853188.67024: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853188.67027: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7554 1726853188.67914: done processing included file 7554 1726853188.67916: iterating over new_blocks loaded from include file 7554 1726853188.67918: in VariableManager get_vars() 7554 1726853188.67949: done with get_vars() 7554 1726853188.67951: filtering new block on tags 7554 1726853188.67969: done filtering new block on tags 7554 1726853188.67974: in VariableManager get_vars() 7554 1726853188.68003: done with get_vars() 7554 1726853188.68005: filtering new block on tags 7554 1726853188.68025: done filtering new block on tags 7554 1726853188.68028: in VariableManager get_vars() 7554 1726853188.68054: done with get_vars() 7554 1726853188.68056: filtering new block on tags 7554 1726853188.68078: done filtering new block on tags 7554 1726853188.68080: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7554 1726853188.68087: extending task lists for all hosts with included blocks 7554 1726853188.68853: done 
extending task lists 7554 1726853188.68855: done processing included files 7554 1726853188.68856: results queue empty 7554 1726853188.68856: checking for any_errors_fatal 7554 1726853188.68859: done checking for any_errors_fatal 7554 1726853188.68860: checking for max_fail_percentage 7554 1726853188.68861: done checking for max_fail_percentage 7554 1726853188.68862: checking to see if all hosts have failed and the running result is not ok 7554 1726853188.68863: done checking to see if all hosts have failed 7554 1726853188.68864: getting the remaining hosts for this loop 7554 1726853188.68865: done getting the remaining hosts for this loop 7554 1726853188.68867: getting the next task for host managed_node3 7554 1726853188.68872: done getting next task for host managed_node3 7554 1726853188.68874: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7554 1726853188.68877: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853188.68888: getting variables 7554 1726853188.68889: in VariableManager get_vars() 7554 1726853188.68906: Calling all_inventory to load vars for managed_node3 7554 1726853188.68908: Calling groups_inventory to load vars for managed_node3 7554 1726853188.68909: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853188.68914: Calling all_plugins_play to load vars for managed_node3 7554 1726853188.68916: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853188.68918: Calling groups_plugins_play to load vars for managed_node3 7554 1726853188.70321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853188.71939: done with get_vars() 7554 1726853188.71959: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:26:28 -0400 (0:00:00.110) 0:00:42.687 ****** 7554 1726853188.72042: entering _queue_task() for managed_node3/setup 7554 1726853188.72397: worker is 1 (out of 1 available) 7554 1726853188.72411: exiting _queue_task() for managed_node3/setup 7554 1726853188.72536: done queuing things up, now waiting for results queue to drain 7554 1726853188.72538: waiting for pending results... 
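(The task queued here, set_facts.yml:3, guards fact gathering. A plausible shape, inferred from the `setup` action, the no_log-censored skip result, and the conditional evaluated a few entries later; the `gather_subset` value is an assumption, not visible in this log.)

```yaml
# Hypothetical sketch of the fact-gathering guard seen in the trace.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min   # assumed subset; not shown in the log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true           # matches the censored "'no_log: true'" skip result in the trace
```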
7554 1726853188.72734: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7554 1726853188.72970: in run() - task 02083763-bbaf-bdc3-98b6-0000000019b6 7554 1726853188.72976: variable 'ansible_search_path' from source: unknown 7554 1726853188.72980: variable 'ansible_search_path' from source: unknown 7554 1726853188.72984: calling self._execute() 7554 1726853188.73054: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.73061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.73072: variable 'omit' from source: magic vars 7554 1726853188.73500: variable 'ansible_distribution_major_version' from source: facts 7554 1726853188.73516: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853188.73749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853188.75898: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853188.76005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853188.76008: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853188.76047: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853188.76070: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853188.76153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853188.76223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853188.76227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853188.76254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853188.76267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853188.76316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853188.76348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853188.76439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853188.76445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853188.76452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853188.76594: variable '__network_required_facts' from source: role '' defaults 
7554 1726853188.76605: variable 'ansible_facts' from source: unknown
7554 1726853188.77431: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
7554 1726853188.77435: when evaluation is False, skipping this task
7554 1726853188.77438: _execute() done
7554 1726853188.77440: dumping result to json
7554 1726853188.77445: done dumping result, returning
7554 1726853188.77451: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-bdc3-98b6-0000000019b6]
7554 1726853188.77457: sending task result for task 02083763-bbaf-bdc3-98b6-0000000019b6
7554 1726853188.77633: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000019b6
7554 1726853188.77637: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
7554 1726853188.77691: no more pending results, returning what we have
7554 1726853188.77696: results queue empty
7554 1726853188.77697: checking for any_errors_fatal
7554 1726853188.77698: done checking for any_errors_fatal
7554 1726853188.77699: checking for max_fail_percentage
7554 1726853188.77701: done checking for max_fail_percentage
7554 1726853188.77702: checking to see if all hosts have failed and the running result is not ok
7554 1726853188.77703: done checking to see if all hosts have failed
7554 1726853188.77704: getting the remaining hosts for this loop
7554 1726853188.77706: done getting the remaining hosts for this loop
7554 1726853188.77710: getting the next task for host managed_node3
7554 1726853188.77720: done getting next task for host managed_node3
7554 1726853188.77725: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
7554 1726853188.77729: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853188.77987: getting variables
7554 1726853188.77990: in VariableManager get_vars()
7554 1726853188.78039: Calling all_inventory to load vars for managed_node3
7554 1726853188.78041: Calling groups_inventory to load vars for managed_node3
7554 1726853188.78043: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853188.78052: Calling all_plugins_play to load vars for managed_node3
7554 1726853188.78056: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853188.78059: Calling groups_plugins_play to load vars for managed_node3
7554 1726853188.79375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853188.81064: done with get_vars()
7554 1726853188.81092: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 13:26:28 -0400 (0:00:00.091) 0:00:42.779 ******
7554 1726853188.81207: entering _queue_task() for managed_node3/stat
7554 1726853188.81778: worker is 1 (out of 1 available)
7554 1726853188.81787: exiting _queue_task()
for managed_node3/stat 7554 1726853188.81798: done queuing things up, now waiting for results queue to drain 7554 1726853188.81799: waiting for pending results... 7554 1726853188.82089: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7554 1726853188.82096: in run() - task 02083763-bbaf-bdc3-98b6-0000000019b8 7554 1726853188.82099: variable 'ansible_search_path' from source: unknown 7554 1726853188.82102: variable 'ansible_search_path' from source: unknown 7554 1726853188.82104: calling self._execute() 7554 1726853188.82175: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.82181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.82193: variable 'omit' from source: magic vars 7554 1726853188.82576: variable 'ansible_distribution_major_version' from source: facts 7554 1726853188.82589: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853188.82737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853188.82996: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853188.83045: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853188.83075: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853188.83110: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853188.83193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853188.83216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853188.83249: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853188.83273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853188.83355: variable '__network_is_ostree' from source: set_fact 7554 1726853188.83361: Evaluated conditional (not __network_is_ostree is defined): False 7554 1726853188.83364: when evaluation is False, skipping this task 7554 1726853188.83367: _execute() done 7554 1726853188.83372: dumping result to json 7554 1726853188.83375: done dumping result, returning 7554 1726853188.83383: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-bdc3-98b6-0000000019b8] 7554 1726853188.83389: sending task result for task 02083763-bbaf-bdc3-98b6-0000000019b8 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7554 1726853188.83529: no more pending results, returning what we have 7554 1726853188.83533: results queue empty 7554 1726853188.83534: checking for any_errors_fatal 7554 1726853188.83547: done checking for any_errors_fatal 7554 1726853188.83548: checking for max_fail_percentage 7554 1726853188.83550: done checking for max_fail_percentage 7554 1726853188.83551: checking to see if all hosts have failed and the running result is not ok 7554 1726853188.83552: done checking to see if all hosts have failed 7554 1726853188.83553: getting the remaining hosts for this loop 7554 1726853188.83555: done getting the remaining hosts for this loop 7554 1726853188.83559: 
getting the next task for host managed_node3 7554 1726853188.83565: done getting next task for host managed_node3 7554 1726853188.83569: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7554 1726853188.83574: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853188.83596: getting variables 7554 1726853188.83598: in VariableManager get_vars() 7554 1726853188.83646: Calling all_inventory to load vars for managed_node3 7554 1726853188.83649: Calling groups_inventory to load vars for managed_node3 7554 1726853188.83767: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853188.83775: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000019b8 7554 1726853188.83778: WORKER PROCESS EXITING 7554 1726853188.83787: Calling all_plugins_play to load vars for managed_node3 7554 1726853188.83790: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853188.83793: Calling groups_plugins_play to load vars for managed_node3 7554 1726853188.85466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853188.87067: done with get_vars() 7554 1726853188.87088: done getting variables 7554 1726853188.87146: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:26:28 -0400 (0:00:00.059) 0:00:42.839 ****** 7554 1726853188.87187: entering _queue_task() for managed_node3/set_fact 7554 1726853188.87520: worker is 1 (out of 1 available) 7554 1726853188.87534: exiting _queue_task() for managed_node3/set_fact 7554 1726853188.87662: done queuing things up, now waiting for results queue to drain 7554 1726853188.87664: waiting for pending results... 
7554 1726853188.87878: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7554 1726853188.88042: in run() - task 02083763-bbaf-bdc3-98b6-0000000019b9 7554 1726853188.88057: variable 'ansible_search_path' from source: unknown 7554 1726853188.88061: variable 'ansible_search_path' from source: unknown 7554 1726853188.88105: calling self._execute() 7554 1726853188.88213: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.88222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.88233: variable 'omit' from source: magic vars 7554 1726853188.88616: variable 'ansible_distribution_major_version' from source: facts 7554 1726853188.88629: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853188.88809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853188.89095: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853188.89145: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853188.89178: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853188.89218: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853188.89304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853188.89376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853188.89380: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853188.89383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853188.89482: variable '__network_is_ostree' from source: set_fact 7554 1726853188.89489: Evaluated conditional (not __network_is_ostree is defined): False 7554 1726853188.89492: when evaluation is False, skipping this task 7554 1726853188.89495: _execute() done 7554 1726853188.89497: dumping result to json 7554 1726853188.89499: done dumping result, returning 7554 1726853188.89512: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-bdc3-98b6-0000000019b9] 7554 1726853188.89518: sending task result for task 02083763-bbaf-bdc3-98b6-0000000019b9 7554 1726853188.89678: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000019b9 7554 1726853188.89682: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7554 1726853188.89772: no more pending results, returning what we have 7554 1726853188.89775: results queue empty 7554 1726853188.89776: checking for any_errors_fatal 7554 1726853188.89782: done checking for any_errors_fatal 7554 1726853188.89783: checking for max_fail_percentage 7554 1726853188.89784: done checking for max_fail_percentage 7554 1726853188.89785: checking to see if all hosts have failed and the running result is not ok 7554 1726853188.89786: done checking to see if all hosts have failed 7554 1726853188.89787: getting the remaining hosts for this loop 7554 1726853188.89788: done getting the remaining hosts for this loop 7554 1726853188.89792: 
getting the next task for host managed_node3 7554 1726853188.89801: done getting next task for host managed_node3 7554 1726853188.89805: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7554 1726853188.89809: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853188.89830: getting variables 7554 1726853188.89953: in VariableManager get_vars() 7554 1726853188.89998: Calling all_inventory to load vars for managed_node3 7554 1726853188.90001: Calling groups_inventory to load vars for managed_node3 7554 1726853188.90004: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853188.90012: Calling all_plugins_play to load vars for managed_node3 7554 1726853188.90015: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853188.90018: Calling groups_plugins_play to load vars for managed_node3 7554 1726853188.91255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853188.92939: done with get_vars() 7554 1726853188.92959: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:26:28 -0400 (0:00:00.058) 0:00:42.898 ****** 7554 1726853188.93058: entering _queue_task() for managed_node3/service_facts 7554 1726853188.93581: worker is 1 (out of 1 available) 7554 1726853188.93590: exiting _queue_task() for managed_node3/service_facts 7554 1726853188.93601: done queuing things up, now waiting for results queue to drain 7554 1726853188.93602: waiting for pending results... 
7554 1726853188.93691: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7554 1726853188.94178: in run() - task 02083763-bbaf-bdc3-98b6-0000000019bb 7554 1726853188.94182: variable 'ansible_search_path' from source: unknown 7554 1726853188.94185: variable 'ansible_search_path' from source: unknown 7554 1726853188.94187: calling self._execute() 7554 1726853188.94190: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.94193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.94196: variable 'omit' from source: magic vars 7554 1726853188.94311: variable 'ansible_distribution_major_version' from source: facts 7554 1726853188.94479: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853188.94483: variable 'omit' from source: magic vars 7554 1726853188.94486: variable 'omit' from source: magic vars 7554 1726853188.94488: variable 'omit' from source: magic vars 7554 1726853188.94490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853188.94512: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853188.94531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853188.94547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853188.94558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853188.94599: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853188.94603: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.94605: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853188.94704: Set connection var ansible_shell_executable to /bin/sh 7554 1726853188.94731: Set connection var ansible_pipelining to False 7554 1726853188.94735: Set connection var ansible_shell_type to sh 7554 1726853188.94737: Set connection var ansible_connection to ssh 7554 1726853188.94739: Set connection var ansible_timeout to 10 7554 1726853188.94741: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853188.94746: variable 'ansible_shell_executable' from source: unknown 7554 1726853188.94749: variable 'ansible_connection' from source: unknown 7554 1726853188.94751: variable 'ansible_module_compression' from source: unknown 7554 1726853188.94754: variable 'ansible_shell_type' from source: unknown 7554 1726853188.94756: variable 'ansible_shell_executable' from source: unknown 7554 1726853188.94758: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853188.94763: variable 'ansible_pipelining' from source: unknown 7554 1726853188.94765: variable 'ansible_timeout' from source: unknown 7554 1726853188.94770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853188.94962: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853188.94974: variable 'omit' from source: magic vars 7554 1726853188.94980: starting attempt loop 7554 1726853188.94983: running the handler 7554 1726853188.94997: _low_level_execute_command(): starting 7554 1726853188.95003: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853188.95725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853188.95736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853188.95747: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853188.95761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853188.95776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853188.95786: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853188.95886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853188.95892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853188.96000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853188.97714: stdout chunk (state=3): >>>/root <<< 7554 1726853188.97812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853188.97879: stderr chunk (state=3): >>><<< 7554 1726853188.97889: stdout chunk (state=3): >>><<< 7554 1726853188.97918: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853188.97937: _low_level_execute_command(): starting 7554 1726853188.97951: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852 `" && echo ansible-tmp-1726853188.979251-9167-261673150642852="` echo /root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852 `" ) && sleep 0' 7554 1726853188.98577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853188.98580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853188.98585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853188.98675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853188.98687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853188.98690: stderr chunk (state=3): >>>debug2: match not found <<< 7554 
1726853188.98692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853188.98694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853188.98697: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853188.98699: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7554 1726853188.98701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853188.98703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853188.98705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853188.98718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853188.98720: stderr chunk (state=3): >>>debug2: match found <<< 7554 1726853188.98722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853188.98781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853188.98793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853188.98803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853188.98903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853189.00912: stdout chunk (state=3): >>>ansible-tmp-1726853188.979251-9167-261673150642852=/root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852 <<< 7554 1726853189.01078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853189.01081: stdout chunk (state=3): >>><<< 7554 1726853189.01085: stderr chunk (state=3): >>><<< 7554 1726853189.01101: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853188.979251-9167-261673150642852=/root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853189.01276: variable 'ansible_module_compression' from source: unknown 7554 1726853189.01279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7554 1726853189.01282: variable 'ansible_facts' from source: unknown 7554 1726853189.01337: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852/AnsiballZ_service_facts.py 7554 1726853189.01530: Sending initial data 7554 1726853189.01533: Sent initial data (159 bytes) 7554 1726853189.02161: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853189.02181: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853189.02290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853189.02309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853189.02327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853189.02353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853189.02457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853189.04127: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 
debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853189.04182: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853189.04240: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpijbdvayp /root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852/AnsiballZ_service_facts.py <<< 7554 1726853189.04249: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852/AnsiballZ_service_facts.py" <<< 7554 1726853189.04294: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpijbdvayp" to remote "/root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852/AnsiballZ_service_facts.py" <<< 7554 1726853189.04947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853189.05006: stderr chunk (state=3): >>><<< 7554 1726853189.05102: stdout chunk (state=3): >>><<< 7554 1726853189.05105: done transferring module to remote 7554 1726853189.05108: _low_level_execute_command(): starting 7554 1726853189.05110: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852/ /root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852/AnsiballZ_service_facts.py && sleep 0' 7554 1726853189.05651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853189.05676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853189.05721: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853189.05724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853189.05780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853189.05783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853189.05795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853189.05861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853189.07755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853189.07777: stderr chunk (state=3): >>><<< 7554 1726853189.07781: stdout chunk (state=3): >>><<< 7554 1726853189.07785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853189.07789: _low_level_execute_command(): starting 7554 1726853189.07792: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852/AnsiballZ_service_facts.py && sleep 0' 7554 1726853189.08412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853189.08415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853189.08509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853189.08515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853189.08518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853189.08520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853189.08522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853189.08562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853189.08639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853190.68461: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": 
"hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 7554 1726853190.68480: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 7554 1726853190.68519: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service<<< 7554 1726853190.68525: stdout chunk (state=3): >>>": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7554 1726853190.70161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853190.70190: stderr chunk (state=3): >>><<< 7554 1726853190.70195: stdout chunk (state=3): >>><<< 7554 1726853190.70217: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, 
"gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": 
{"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": 
{"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": 
{"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853190.70652: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853190.70660: _low_level_execute_command(): starting 7554 1726853190.70668: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853188.979251-9167-261673150642852/ > /dev/null 2>&1 && sleep 0' 7554 1726853190.71122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853190.71126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853190.71130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853190.71132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853190.71135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853190.71189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853190.71196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853190.71197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853190.71253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853190.73139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853190.73170: stderr chunk (state=3): >>><<< 7554 1726853190.73175: stdout chunk (state=3): >>><<< 7554 1726853190.73186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853190.73192: handler run complete 7554 1726853190.73306: variable 'ansible_facts' from source: unknown 7554 1726853190.73402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853190.73677: variable 'ansible_facts' from source: unknown 7554 1726853190.73754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853190.73867: attempt loop complete, returning result 7554 1726853190.73870: _execute() done 7554 1726853190.73878: dumping result to json 7554 1726853190.73910: done dumping result, returning 7554 1726853190.73919: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-bdc3-98b6-0000000019bb] 7554 1726853190.73922: sending task result for task 02083763-bbaf-bdc3-98b6-0000000019bb ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853190.74526: no more pending results, returning what we have 7554 1726853190.74528: results queue empty 7554 1726853190.74529: checking for any_errors_fatal 7554 1726853190.74534: done checking for any_errors_fatal 7554 1726853190.74535: checking for max_fail_percentage 7554 1726853190.74536: done checking for max_fail_percentage 7554 1726853190.74537: checking to see if all hosts have failed and the running result is not ok 7554 1726853190.74538: done checking to see if all hosts have failed 7554 1726853190.74539: getting the remaining hosts for this loop 7554 1726853190.74540: done getting the remaining hosts for this loop 7554 1726853190.74545: getting the next task for host managed_node3 7554 1726853190.74551: done getting next task for host managed_node3 7554 1726853190.74554: ^ task is: TASK: 
fedora.linux_system_roles.network : Check which packages are installed 7554 1726853190.74558: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853190.74568: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000019bb 7554 1726853190.74572: WORKER PROCESS EXITING 7554 1726853190.74580: getting variables 7554 1726853190.74581: in VariableManager get_vars() 7554 1726853190.74609: Calling all_inventory to load vars for managed_node3 7554 1726853190.74610: Calling groups_inventory to load vars for managed_node3 7554 1726853190.74612: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853190.74618: Calling all_plugins_play to load vars for managed_node3 7554 1726853190.74620: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853190.74622: Calling groups_plugins_play to load vars for managed_node3 7554 1726853190.75407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853190.76262: done with get_vars() 7554 1726853190.76281: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:26:30 -0400 (0:00:01.832) 0:00:44.731 ****** 7554 1726853190.76356: entering _queue_task() for managed_node3/package_facts 7554 1726853190.76576: worker is 1 (out of 1 available) 7554 1726853190.76589: exiting _queue_task() for managed_node3/package_facts 7554 1726853190.76601: done queuing things up, now waiting for results queue to drain 7554 1726853190.76602: waiting for pending results... 7554 1726853190.76784: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7554 1726853190.76906: in run() - task 02083763-bbaf-bdc3-98b6-0000000019bc 7554 1726853190.76911: variable 'ansible_search_path' from source: unknown 7554 1726853190.76913: variable 'ansible_search_path' from source: unknown 7554 1726853190.76932: calling self._execute() 7554 1726853190.77007: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853190.77011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853190.77021: variable 'omit' from source: magic vars 7554 1726853190.77290: variable 'ansible_distribution_major_version' from source: facts 7554 1726853190.77299: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853190.77305: variable 'omit' from source: magic vars 7554 1726853190.77355: variable 'omit' from source: magic vars 7554 1726853190.77385: variable 'omit' from source: magic vars 7554 1726853190.77416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853190.77445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853190.77459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853190.77474: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853190.77488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853190.77510: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853190.77513: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853190.77516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853190.77585: Set connection var ansible_shell_executable to /bin/sh 7554 1726853190.77592: Set connection var ansible_pipelining to False 7554 1726853190.77598: Set connection var ansible_shell_type to sh 7554 1726853190.77604: Set connection var ansible_connection to ssh 7554 1726853190.77612: Set connection var ansible_timeout to 10 7554 1726853190.77617: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853190.77634: variable 'ansible_shell_executable' from source: unknown 7554 1726853190.77636: variable 'ansible_connection' from source: unknown 7554 1726853190.77639: variable 'ansible_module_compression' from source: unknown 7554 1726853190.77641: variable 'ansible_shell_type' from source: unknown 7554 1726853190.77643: variable 'ansible_shell_executable' from source: unknown 7554 1726853190.77649: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853190.77652: variable 'ansible_pipelining' from source: unknown 7554 1726853190.77655: variable 'ansible_timeout' from source: unknown 7554 1726853190.77659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853190.77807: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
7554 1726853190.77819: variable 'omit' from source: magic vars 7554 1726853190.77822: starting attempt loop 7554 1726853190.77825: running the handler 7554 1726853190.77837: _low_level_execute_command(): starting 7554 1726853190.77843: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853190.78347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853190.78351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853190.78356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853190.78358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853190.78413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853190.78416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853190.78418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853190.78490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853190.80187: stdout chunk (state=3): >>>/root <<< 7554 1726853190.80286: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 7554 1726853190.80314: stderr chunk (state=3): >>><<< 7554 1726853190.80317: stdout chunk (state=3): >>><<< 7554 1726853190.80335: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853190.80347: _low_level_execute_command(): starting 7554 1726853190.80351: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336 `" && echo ansible-tmp-1726853190.803349-9204-166799983212336="` echo /root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336 `" ) && sleep 0' 7554 1726853190.80783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853190.80786: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853190.80788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853190.80797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853190.80799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853190.80846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853190.80854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853190.80912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853190.82888: stdout chunk (state=3): >>>ansible-tmp-1726853190.803349-9204-166799983212336=/root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336 <<< 7554 1726853190.82992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853190.83023: stderr chunk (state=3): >>><<< 7554 1726853190.83026: stdout chunk (state=3): >>><<< 7554 1726853190.83040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853190.803349-9204-166799983212336=/root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853190.83080: variable 'ansible_module_compression' from source: unknown 7554 1726853190.83118: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7554 1726853190.83168: variable 'ansible_facts' from source: unknown 7554 1726853190.83288: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336/AnsiballZ_package_facts.py 7554 1726853190.83389: Sending initial data 7554 1726853190.83393: Sent initial data (159 bytes) 7554 1726853190.84091: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853190.84146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853190.85765: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7554 1726853190.85769: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853190.85820: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853190.85884: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpb1cywvy5 /root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336/AnsiballZ_package_facts.py <<< 7554 1726853190.85887: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336/AnsiballZ_package_facts.py" <<< 7554 1726853190.85945: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpb1cywvy5" to remote "/root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336/AnsiballZ_package_facts.py" <<< 7554 1726853190.87277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853190.87280: stdout chunk (state=3): >>><<< 7554 1726853190.87282: stderr chunk (state=3): >>><<< 7554 1726853190.87284: done transferring module to remote 7554 1726853190.87286: _low_level_execute_command(): starting 7554 1726853190.87288: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336/ /root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336/AnsiballZ_package_facts.py && sleep 0' 7554 1726853190.87926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853190.87939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853190.87965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853190.87986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853190.88078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853190.88111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853190.88127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853190.88150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853190.88262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853190.90174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853190.90185: stdout chunk (state=3): >>><<< 7554 1726853190.90199: stderr chunk (state=3): >>><<< 7554 1726853190.90302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853190.90309: _low_level_execute_command(): starting 7554 1726853190.90311: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336/AnsiballZ_package_facts.py && sleep 0' 7554 1726853190.90993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853190.91004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853190.91093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853190.91129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853190.91152: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853190.91176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853190.91280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853191.36132: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": 
[{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": 
"xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 7554 1726853191.36205: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": 
[{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": 
[{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": 
"2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 7554 1726853191.36276: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", 
"version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", 
"release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": 
[{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": 
"1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": 
"3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": 
[{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": 
[{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 7554 1726853191.36287: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 7554 1726853191.36357: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", 
"release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": 
"perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": 
"1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7554 1726853191.38192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853191.38268: stderr chunk (state=3): >>><<< 7554 1726853191.38274: stdout chunk (state=3): >>><<< 7554 1726853191.38290: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": 
"20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": 
"2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": 
"4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": 
"3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", 
"version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": 
"1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": 
[{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": 
"1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": 
"perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": 
"python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853191.40678: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853191.40682: _low_level_execute_command(): starting 7554 1726853191.40685: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853190.803349-9204-166799983212336/ > /dev/null 2>&1 && sleep 0' 7554 1726853191.41319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853191.41345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853191.41363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853191.41384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853191.41453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853191.41499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853191.41513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853191.41531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853191.41626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853191.43601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853191.43605: stdout chunk (state=3): >>><<< 7554 1726853191.43607: stderr chunk (state=3): >>><<< 7554 1726853191.43690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853191.43693: handler run complete 7554 1726853191.44666: variable 'ansible_facts' from source: unknown 7554 1726853191.45454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853191.47506: variable 'ansible_facts' from source: unknown 7554 1726853191.48009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853191.48679: attempt loop complete, returning result 7554 1726853191.48695: _execute() done 7554 1726853191.48703: dumping result to json 7554 1726853191.48922: done dumping result, returning 7554 1726853191.48938: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-bdc3-98b6-0000000019bc] 7554 1726853191.48989: sending task result for task 02083763-bbaf-bdc3-98b6-0000000019bc 7554 1726853191.51501: done sending task result for task 02083763-bbaf-bdc3-98b6-0000000019bc 7554 1726853191.51504: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853191.51661: no more pending results, returning what we have 7554 1726853191.51664: results queue empty 7554 1726853191.51665: checking for any_errors_fatal 7554 1726853191.51669: done checking for any_errors_fatal 7554 1726853191.51670: checking for max_fail_percentage 7554 1726853191.51674: 
done checking for max_fail_percentage 7554 1726853191.51675: checking to see if all hosts have failed and the running result is not ok 7554 1726853191.51676: done checking to see if all hosts have failed 7554 1726853191.51676: getting the remaining hosts for this loop 7554 1726853191.51678: done getting the remaining hosts for this loop 7554 1726853191.51681: getting the next task for host managed_node3 7554 1726853191.51687: done getting next task for host managed_node3 7554 1726853191.51690: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7554 1726853191.51693: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853191.51702: getting variables 7554 1726853191.51704: in VariableManager get_vars() 7554 1726853191.51739: Calling all_inventory to load vars for managed_node3 7554 1726853191.51742: Calling groups_inventory to load vars for managed_node3 7554 1726853191.51747: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853191.51756: Calling all_plugins_play to load vars for managed_node3 7554 1726853191.51758: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853191.51761: Calling groups_plugins_play to load vars for managed_node3 7554 1726853191.53014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853191.54945: done with get_vars() 7554 1726853191.54968: done getting variables 7554 1726853191.55031: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:26:31 -0400 (0:00:00.787) 0:00:45.518 ****** 7554 1726853191.55073: entering _queue_task() for managed_node3/debug 7554 1726853191.55591: worker is 1 (out of 1 available) 7554 1726853191.55601: exiting _queue_task() for managed_node3/debug 7554 1726853191.55612: done queuing things up, now waiting for results queue to drain 7554 1726853191.55613: waiting for pending results... 
7554 1726853191.55741: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7554 1726853191.55852: in run() - task 02083763-bbaf-bdc3-98b6-00000000010c 7554 1726853191.55875: variable 'ansible_search_path' from source: unknown 7554 1726853191.55946: variable 'ansible_search_path' from source: unknown 7554 1726853191.55950: calling self._execute() 7554 1726853191.56019: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853191.56031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853191.56045: variable 'omit' from source: magic vars 7554 1726853191.56418: variable 'ansible_distribution_major_version' from source: facts 7554 1726853191.56435: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853191.56446: variable 'omit' from source: magic vars 7554 1726853191.56507: variable 'omit' from source: magic vars 7554 1726853191.56613: variable 'network_provider' from source: set_fact 7554 1726853191.56637: variable 'omit' from source: magic vars 7554 1726853191.56682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853191.56727: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853191.56810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853191.56814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853191.56816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853191.56827: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853191.56836: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853191.56843: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853191.56951: Set connection var ansible_shell_executable to /bin/sh 7554 1726853191.56966: Set connection var ansible_pipelining to False 7554 1726853191.56977: Set connection var ansible_shell_type to sh 7554 1726853191.56984: Set connection var ansible_connection to ssh 7554 1726853191.56998: Set connection var ansible_timeout to 10 7554 1726853191.57005: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853191.57031: variable 'ansible_shell_executable' from source: unknown 7554 1726853191.57135: variable 'ansible_connection' from source: unknown 7554 1726853191.57137: variable 'ansible_module_compression' from source: unknown 7554 1726853191.57139: variable 'ansible_shell_type' from source: unknown 7554 1726853191.57141: variable 'ansible_shell_executable' from source: unknown 7554 1726853191.57142: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853191.57144: variable 'ansible_pipelining' from source: unknown 7554 1726853191.57145: variable 'ansible_timeout' from source: unknown 7554 1726853191.57147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853191.57198: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853191.57216: variable 'omit' from source: magic vars 7554 1726853191.57226: starting attempt loop 7554 1726853191.57233: running the handler 7554 1726853191.57296: handler run complete 7554 1726853191.57316: attempt loop complete, returning result 7554 1726853191.57323: _execute() done 7554 1726853191.57329: dumping result to json 7554 1726853191.57336: done dumping result, returning 7554 1726853191.57352: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-bdc3-98b6-00000000010c] 7554 1726853191.57364: sending task result for task 02083763-bbaf-bdc3-98b6-00000000010c 7554 1726853191.57587: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000010c 7554 1726853191.57593: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7554 1726853191.57675: no more pending results, returning what we have 7554 1726853191.57680: results queue empty 7554 1726853191.57681: checking for any_errors_fatal 7554 1726853191.57689: done checking for any_errors_fatal 7554 1726853191.57690: checking for max_fail_percentage 7554 1726853191.57691: done checking for max_fail_percentage 7554 1726853191.57694: checking to see if all hosts have failed and the running result is not ok 7554 1726853191.57696: done checking to see if all hosts have failed 7554 1726853191.57697: getting the remaining hosts for this loop 7554 1726853191.57698: done getting the remaining hosts for this loop 7554 1726853191.57702: getting the next task for host managed_node3 7554 1726853191.57710: done getting next task for host managed_node3 7554 1726853191.57715: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7554 1726853191.57718: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853191.57736: getting variables 7554 1726853191.57739: in VariableManager get_vars() 7554 1726853191.58122: Calling all_inventory to load vars for managed_node3 7554 1726853191.58125: Calling groups_inventory to load vars for managed_node3 7554 1726853191.58127: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853191.58136: Calling all_plugins_play to load vars for managed_node3 7554 1726853191.58139: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853191.58143: Calling groups_plugins_play to load vars for managed_node3 7554 1726853191.59419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853191.62452: done with get_vars() 7554 1726853191.62480: done getting variables 7554 1726853191.62539: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:26:31 -0400 (0:00:00.075) 0:00:45.593 ****** 7554 1726853191.62778: entering _queue_task() for managed_node3/fail 7554 1726853191.63314: worker is 1 (out of 1 available) 7554 1726853191.63326: exiting _queue_task() for managed_node3/fail 7554 1726853191.63338: done queuing things up, now waiting for results queue to drain 7554 1726853191.63340: waiting for pending results... 
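The `ok` result above comes from the role's `debug` action rendering its message with `network_provider` resolved from `set_fact` (here `nm`). As a minimal sketch of that templating step, using the `jinja2` library directly — the template string is an assumption about how the role formats the message, and `nm` is simply the value reported in the log:

```python
from jinja2 import Environment

env = Environment()

# Hypothetical template mirroring the debug message seen in the log;
# "nm" is the provider value the role reported above.
template = env.from_string("Using network provider: {{ network_provider }}")
msg = template.render(network_provider="nm")
print(msg)  # -> Using network provider: nm
```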
7554 1726853191.63900: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7554 1726853191.64215: in run() - task 02083763-bbaf-bdc3-98b6-00000000010d 7554 1726853191.64221: variable 'ansible_search_path' from source: unknown 7554 1726853191.64225: variable 'ansible_search_path' from source: unknown 7554 1726853191.64260: calling self._execute() 7554 1726853191.64412: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853191.64428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853191.64677: variable 'omit' from source: magic vars 7554 1726853191.64831: variable 'ansible_distribution_major_version' from source: facts 7554 1726853191.64850: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853191.64980: variable 'network_state' from source: role '' defaults 7554 1726853191.64998: Evaluated conditional (network_state != {}): False 7554 1726853191.65010: when evaluation is False, skipping this task 7554 1726853191.65020: _execute() done 7554 1726853191.65027: dumping result to json 7554 1726853191.65035: done dumping result, returning 7554 1726853191.65046: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-bdc3-98b6-00000000010d] 7554 1726853191.65057: sending task result for task 02083763-bbaf-bdc3-98b6-00000000010d skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853191.65209: no more pending results, returning what we have 7554 1726853191.65213: results queue empty 7554 1726853191.65214: checking for any_errors_fatal 7554 1726853191.65223: done checking for 
any_errors_fatal 7554 1726853191.65223: checking for max_fail_percentage 7554 1726853191.65225: done checking for max_fail_percentage 7554 1726853191.65226: checking to see if all hosts have failed and the running result is not ok 7554 1726853191.65227: done checking to see if all hosts have failed 7554 1726853191.65228: getting the remaining hosts for this loop 7554 1726853191.65229: done getting the remaining hosts for this loop 7554 1726853191.65233: getting the next task for host managed_node3 7554 1726853191.65240: done getting next task for host managed_node3 7554 1726853191.65244: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7554 1726853191.65248: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853191.65276: getting variables 7554 1726853191.65278: in VariableManager get_vars() 7554 1726853191.65333: Calling all_inventory to load vars for managed_node3 7554 1726853191.65336: Calling groups_inventory to load vars for managed_node3 7554 1726853191.65338: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853191.65351: Calling all_plugins_play to load vars for managed_node3 7554 1726853191.65354: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853191.65357: Calling groups_plugins_play to load vars for managed_node3 7554 1726853191.65892: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000010d 7554 1726853191.65896: WORKER PROCESS EXITING 7554 1726853191.67220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853191.82226: done with get_vars() 7554 1726853191.82263: done getting variables 7554 1726853191.82315: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:26:31 -0400 (0:00:00.197) 0:00:45.791 ****** 7554 1726853191.82345: entering _queue_task() for managed_node3/fail 7554 1726853191.82696: worker is 1 (out of 1 available) 7554 1726853191.82710: exiting _queue_task() for managed_node3/fail 7554 1726853191.82721: done queuing things up, now waiting for results queue to drain 7554 1726853191.82724: waiting for pending results... 
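The `fail` task above is skipped because its `when:` condition `network_state != {}` evaluates to False: the role default for `network_state` is an empty dict. As a rough illustration (not Ansible's actual code path), the same expression can be evaluated with `jinja2`'s `compile_expression`, which is essentially what the conditional check boils down to:

```python
from jinja2 import Environment

env = Environment()
when = env.compile_expression("network_state != {}")

# Role default: empty dict -> condition is False, so the task is skipped.
skipped_case = when(network_state={})
# A non-empty state would let the fail task run instead.
running_case = when(network_state={"interfaces": []})
print(skipped_case, running_case)  # -> False True
```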
7554 1726853191.83028: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7554 1726853191.83191: in run() - task 02083763-bbaf-bdc3-98b6-00000000010e 7554 1726853191.83216: variable 'ansible_search_path' from source: unknown 7554 1726853191.83226: variable 'ansible_search_path' from source: unknown 7554 1726853191.83308: calling self._execute() 7554 1726853191.83388: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853191.83401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853191.83419: variable 'omit' from source: magic vars 7554 1726853191.83799: variable 'ansible_distribution_major_version' from source: facts 7554 1726853191.83816: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853191.83959: variable 'network_state' from source: role '' defaults 7554 1726853191.83962: Evaluated conditional (network_state != {}): False 7554 1726853191.83964: when evaluation is False, skipping this task 7554 1726853191.83966: _execute() done 7554 1726853191.83970: dumping result to json 7554 1726853191.83979: done dumping result, returning 7554 1726853191.84177: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-bdc3-98b6-00000000010e] 7554 1726853191.84181: sending task result for task 02083763-bbaf-bdc3-98b6-00000000010e 7554 1726853191.84246: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000010e 7554 1726853191.84250: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853191.84296: no more pending results, returning what we have 7554 1726853191.84299: results queue 
empty 7554 1726853191.84300: checking for any_errors_fatal 7554 1726853191.84309: done checking for any_errors_fatal 7554 1726853191.84310: checking for max_fail_percentage 7554 1726853191.84311: done checking for max_fail_percentage 7554 1726853191.84312: checking to see if all hosts have failed and the running result is not ok 7554 1726853191.84313: done checking to see if all hosts have failed 7554 1726853191.84314: getting the remaining hosts for this loop 7554 1726853191.84316: done getting the remaining hosts for this loop 7554 1726853191.84319: getting the next task for host managed_node3 7554 1726853191.84325: done getting next task for host managed_node3 7554 1726853191.84329: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7554 1726853191.84332: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853191.84355: getting variables 7554 1726853191.84357: in VariableManager get_vars() 7554 1726853191.84409: Calling all_inventory to load vars for managed_node3 7554 1726853191.84412: Calling groups_inventory to load vars for managed_node3 7554 1726853191.84414: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853191.84425: Calling all_plugins_play to load vars for managed_node3 7554 1726853191.84428: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853191.84431: Calling groups_plugins_play to load vars for managed_node3 7554 1726853191.85842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853191.87418: done with get_vars() 7554 1726853191.87444: done getting variables 7554 1726853191.87504: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:26:31 -0400 (0:00:00.051) 0:00:45.842 ****** 7554 1726853191.87537: entering _queue_task() for managed_node3/fail 7554 1726853191.88099: worker is 1 (out of 1 available) 7554 1726853191.88108: exiting _queue_task() for managed_node3/fail 7554 1726853191.88119: done queuing things up, now waiting for results queue to drain 7554 1726853191.88120: waiting for pending results... 
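The EL10 gate queued above evaluates (per the log that follows) `ansible_distribution_major_version | int > 9` to True. The `int` filter matters because Ansible facts store the major version as a string, and comparing a raw string against an integer would raise a TypeError in Python 3. A sketch of that coercion (a stand-in for illustration, not the role's code):

```python
from jinja2 import Environment

env = Environment()
# Facts report the major version as a string; the "int" filter coerces it
# so the ">" comparison is numeric rather than an error.
gate = env.compile_expression("ansible_distribution_major_version | int > 9")

el10 = gate(ansible_distribution_major_version="10")
el9 = gate(ansible_distribution_major_version="9")
print(el10, el9)  # -> True False
```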
7554 1726853191.88189: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7554 1726853191.88336: in run() - task 02083763-bbaf-bdc3-98b6-00000000010f 7554 1726853191.88578: variable 'ansible_search_path' from source: unknown 7554 1726853191.88582: variable 'ansible_search_path' from source: unknown 7554 1726853191.88584: calling self._execute() 7554 1726853191.88587: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853191.88590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853191.88592: variable 'omit' from source: magic vars 7554 1726853191.88917: variable 'ansible_distribution_major_version' from source: facts 7554 1726853191.88943: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853191.89119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853191.93246: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853191.94190: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853191.94257: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853191.94476: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853191.94480: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853191.94617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853191.94650: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853191.94682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853191.94820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853191.94839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853191.95051: variable 'ansible_distribution_major_version' from source: facts 7554 1726853191.95075: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7554 1726853191.95376: variable 'ansible_distribution' from source: facts 7554 1726853191.95380: variable '__network_rh_distros' from source: role '' defaults 7554 1726853191.95382: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7554 1726853191.96052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853191.96055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853191.96058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853191.96061: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853191.96063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853191.96465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853191.96869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853191.97036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853191.97039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853191.97042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853191.97047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853191.97050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 
1726853191.97052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853191.97054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853191.97056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853191.98010: variable 'network_connections' from source: task vars 7554 1726853191.98016: variable 'interface' from source: play vars 7554 1726853191.98199: variable 'interface' from source: play vars 7554 1726853191.98213: variable 'network_state' from source: role '' defaults 7554 1726853191.98372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853191.98828: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853191.99109: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853191.99113: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853191.99173: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853191.99315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853191.99401: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 7554 1726853191.99650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853191.99654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853191.99657: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 7554 1726853191.99659: when evaluation is False, skipping this task 7554 1726853191.99661: _execute() done 7554 1726853191.99663: dumping result to json 7554 1726853191.99665: done dumping result, returning 7554 1726853191.99668: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-bdc3-98b6-00000000010f] 7554 1726853191.99672: sending task result for task 02083763-bbaf-bdc3-98b6-00000000010f skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 7554 1726853191.99909: no more pending results, returning what we have 7554 1726853191.99913: results queue empty 7554 1726853191.99914: checking for any_errors_fatal 7554 1726853191.99919: done checking for any_errors_fatal 7554 1726853191.99920: checking for max_fail_percentage 7554 1726853191.99921: done checking for 
max_fail_percentage 7554 1726853191.99922: checking to see if all hosts have failed and the running result is not ok 7554 1726853191.99924: done checking to see if all hosts have failed 7554 1726853191.99924: getting the remaining hosts for this loop 7554 1726853191.99926: done getting the remaining hosts for this loop 7554 1726853191.99930: getting the next task for host managed_node3 7554 1726853191.99937: done getting next task for host managed_node3 7554 1726853191.99941: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7554 1726853191.99943: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853191.99966: getting variables 7554 1726853191.99968: in VariableManager get_vars() 7554 1726853192.00019: Calling all_inventory to load vars for managed_node3 7554 1726853192.00021: Calling groups_inventory to load vars for managed_node3 7554 1726853192.00024: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853192.00034: Calling all_plugins_play to load vars for managed_node3 7554 1726853192.00037: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853192.00040: Calling groups_plugins_play to load vars for managed_node3 7554 1726853192.01188: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000010f 7554 1726853192.01191: WORKER PROCESS EXITING 7554 1726853192.04454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853192.07709: done with get_vars() 7554 1726853192.07740: done getting variables 7554 1726853192.08009: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:26:32 -0400 (0:00:00.205) 0:00:46.047 ****** 7554 1726853192.08043: entering _queue_task() for managed_node3/dnf 7554 1726853192.08811: worker is 1 (out of 1 available) 7554 1726853192.08824: exiting _queue_task() for managed_node3/dnf 7554 1726853192.08836: done queuing things up, now waiting for results queue to drain 7554 1726853192.08838: waiting for pending results... 
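The teaming gate skipped above is the most involved conditional in this stretch: a `selectattr` chain that counts connections whose `type` attribute is defined and matches `^team$`. Note that the `match` test is supplied by Ansible, not core Jinja2, so the sketch below registers a minimal stand-in before evaluating the first arm of that conditional over invented sample data:

```python
import re

from jinja2 import Environment

env = Environment()
# "match" is an Ansible test plugin; core Jinja2 lacks it, so register a
# minimal equivalent (regex match anchored at the start of the string).
env.tests["match"] = lambda value, pattern: re.match(pattern, value) is not None

gate = env.compile_expression(
    'network_connections | selectattr("type", "defined")'
    ' | selectattr("type", "match", "^team$") | list | length > 0'
)

# No team-type connection -> False, matching the skip in the log.
no_team = gate(network_connections=[{"name": "eth0", "type": "ethernet"}])
# A team connection would trip the abort task instead.
has_team = gate(network_connections=[{"name": "team0", "type": "team"}])
print(no_team, has_team)  # -> False True
```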
7554 1726853192.09278: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7554 1726853192.09603: in run() - task 02083763-bbaf-bdc3-98b6-000000000110 7554 1726853192.09620: variable 'ansible_search_path' from source: unknown 7554 1726853192.09624: variable 'ansible_search_path' from source: unknown 7554 1726853192.09662: calling self._execute() 7554 1726853192.09767: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853192.09775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853192.09991: variable 'omit' from source: magic vars 7554 1726853192.10816: variable 'ansible_distribution_major_version' from source: facts 7554 1726853192.10830: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853192.11242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853192.14537: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853192.14616: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853192.14684: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853192.14690: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853192.14722: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853192.14876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853192.14881: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853192.14884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853192.14900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853192.14914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853192.15132: variable 'ansible_distribution' from source: facts 7554 1726853192.15135: variable 'ansible_distribution_major_version' from source: facts 7554 1726853192.15138: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7554 1726853192.15195: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853192.15335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853192.15377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853192.15381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853192.15418: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.15433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.15478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.15502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.15527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.15563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.15578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.15705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.15708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.15711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.15714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.15719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.15880: variable 'network_connections' from source: task vars
7554 1726853192.15894: variable 'interface' from source: play vars
7554 1726853192.15964: variable 'interface' from source: play vars
7554 1726853192.16137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7554 1726853192.16223: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7554 1726853192.16266: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7554 1726853192.16297: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7554 1726853192.16326: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7554 1726853192.16372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7554 1726853192.16394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7554 1726853192.16418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.16443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7554 1726853192.16492: variable '__network_team_connections_defined' from source: role '' defaults
7554 1726853192.16731: variable 'network_connections' from source: task vars
7554 1726853192.16736: variable 'interface' from source: play vars
7554 1726853192.16800: variable 'interface' from source: play vars
7554 1726853192.16829: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
7554 1726853192.16832: when evaluation is False, skipping this task
7554 1726853192.16835: _execute() done
7554 1726853192.16837: dumping result to json
7554 1726853192.16839: done dumping result, returning
7554 1726853192.16938: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-000000000110]
7554 1726853192.16941: sending task result for task 02083763-bbaf-bdc3-98b6-000000000110
7554 1726853192.17185: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000110
7554 1726853192.17187: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
7554 1726853192.17230: no more pending results, returning what we have
7554 1726853192.17232: results queue empty
7554 1726853192.17233: checking for any_errors_fatal
7554 1726853192.17238: done checking for any_errors_fatal
7554 1726853192.17239: checking for max_fail_percentage
7554 1726853192.17240: done checking for max_fail_percentage
7554 1726853192.17241: checking to see if all hosts have failed and the running result is not ok
7554 1726853192.17242: done checking to see if all hosts have failed
7554 1726853192.17243: getting the remaining hosts for this loop
7554 1726853192.17244: done getting the remaining hosts for this loop
7554 1726853192.17247: getting the next task for host managed_node3
7554 1726853192.17253: done getting next task for host managed_node3
7554 1726853192.17256: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
7554 1726853192.17258: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853192.17277: getting variables
7554 1726853192.17278: in VariableManager get_vars()
7554 1726853192.17326: Calling all_inventory to load vars for managed_node3
7554 1726853192.17329: Calling groups_inventory to load vars for managed_node3
7554 1726853192.17332: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853192.17340: Calling all_plugins_play to load vars for managed_node3
7554 1726853192.17343: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853192.17346: Calling groups_plugins_play to load vars for managed_node3
7554 1726853192.20192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853192.23963: done with get_vars()
7554 1726853192.23998: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7554 1726853192.24084: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:26:32 -0400 (0:00:00.160) 0:00:46.208 ******
7554 1726853192.24120: entering _queue_task() for managed_node3/yum
7554 1726853192.24934: worker is 1 (out of 1 available)
7554 1726853192.24948: exiting _queue_task() for managed_node3/yum
7554 1726853192.24960: done queuing things up, now waiting for results queue to drain
7554 1726853192.24962: waiting for pending results...
7554 1726853192.25773: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
7554 1726853192.25781: in run() - task 02083763-bbaf-bdc3-98b6-000000000111
7554 1726853192.25785: variable 'ansible_search_path' from source: unknown
7554 1726853192.25788: variable 'ansible_search_path' from source: unknown
7554 1726853192.25901: calling self._execute()
7554 1726853192.26107: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853192.26119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853192.26130: variable 'omit' from source: magic vars
7554 1726853192.26911: variable 'ansible_distribution_major_version' from source: facts
7554 1726853192.26924: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853192.27318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7554 1726853192.32831: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7554 1726853192.32943: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7554 1726853192.33493: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7554 1726853192.33527: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7554 1726853192.33556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7554 1726853192.33630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.33661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.33983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.33987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.33990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.34043: variable 'ansible_distribution_major_version' from source: facts
7554 1726853192.34093: Evaluated conditional (ansible_distribution_major_version | int < 8): False
7554 1726853192.34097: when evaluation is False, skipping this task
7554 1726853192.34100: _execute() done
7554 1726853192.34102: dumping result to json
7554 1726853192.34678: done dumping result, returning
7554 1726853192.34683: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-000000000111]
7554 1726853192.34685: sending task result for task 02083763-bbaf-bdc3-98b6-000000000111
7554 1726853192.34750: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000111
7554 1726853192.34752: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
7554 1726853192.34835: no more pending results, returning what we have
7554 1726853192.34838: results queue empty
7554 1726853192.34839: checking for any_errors_fatal
7554 1726853192.34846: done checking for any_errors_fatal
7554 1726853192.34847: checking for max_fail_percentage
7554 1726853192.34850: done checking for max_fail_percentage
7554 1726853192.34851: checking to see if all hosts have failed and the running result is not ok
7554 1726853192.34852: done checking to see if all hosts have failed
7554 1726853192.34852: getting the remaining hosts for this loop
7554 1726853192.34854: done getting the remaining hosts for this loop
7554 1726853192.34858: getting the next task for host managed_node3
7554 1726853192.34873: done getting next task for host managed_node3
7554 1726853192.34878: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
7554 1726853192.34881: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853192.34906: getting variables
7554 1726853192.34908: in VariableManager get_vars()
7554 1726853192.34961: Calling all_inventory to load vars for managed_node3
7554 1726853192.34964: Calling groups_inventory to load vars for managed_node3
7554 1726853192.34967: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853192.35375: Calling all_plugins_play to load vars for managed_node3
7554 1726853192.35379: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853192.35383: Calling groups_plugins_play to load vars for managed_node3
7554 1726853192.39429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853192.42868: done with get_vars()
7554 1726853192.43059: done getting variables
7554 1726853192.43130: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 13:26:32 -0400 (0:00:00.190) 0:00:46.399 ******
7554 1726853192.43203: entering _queue_task() for managed_node3/fail
7554 1726853192.43996: worker is 1 (out of 1 available)
7554 1726853192.44125: exiting _queue_task() for managed_node3/fail
7554 1726853192.44136: done queuing things up, now waiting for results queue to drain
7554 1726853192.44137: waiting for pending results...
7554 1726853192.44687: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
7554 1726853192.44741: in run() - task 02083763-bbaf-bdc3-98b6-000000000112
7554 1726853192.44977: variable 'ansible_search_path' from source: unknown
7554 1726853192.44982: variable 'ansible_search_path' from source: unknown
7554 1726853192.44985: calling self._execute()
7554 1726853192.45108: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853192.45178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853192.45211: variable 'omit' from source: magic vars
7554 1726853192.46142: variable 'ansible_distribution_major_version' from source: facts
7554 1726853192.46161: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853192.46478: variable '__network_wireless_connections_defined' from source: role '' defaults
7554 1726853192.46957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7554 1726853192.50012: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7554 1726853192.50144: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7554 1726853192.50245: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7554 1726853192.50337: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7554 1726853192.50377: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7554 1726853192.50461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.50501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.50530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.50582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.50604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.50659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.50689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.50719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.50765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.50807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.50835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.50865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.50895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.50977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.50981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.51150: variable 'network_connections' from source: task vars
7554 1726853192.51169: variable 'interface' from source: play vars
7554 1726853192.51250: variable 'interface' from source: play vars
7554 1726853192.51331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7554 1726853192.51512: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7554 1726853192.51575: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7554 1726853192.51609: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7554 1726853192.51645: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7554 1726853192.51695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7554 1726853192.51719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7554 1726853192.51753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.51787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7554 1726853192.51845: variable '__network_team_connections_defined' from source: role '' defaults
7554 1726853192.52144: variable 'network_connections' from source: task vars
7554 1726853192.52190: variable 'interface' from source: play vars
7554 1726853192.52340: variable 'interface' from source: play vars
7554 1726853192.52435: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
7554 1726853192.52439: when evaluation is False, skipping this task
7554 1726853192.52441: _execute() done
7554 1726853192.52443: dumping result to json
7554 1726853192.52445: done dumping result, returning
7554 1726853192.52448: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-000000000112]
7554 1726853192.52449: sending task result for task 02083763-bbaf-bdc3-98b6-000000000112
7554 1726853192.52655: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000112
7554 1726853192.52658: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
7554 1726853192.52713: no more pending results, returning what we have
7554 1726853192.52717: results queue empty
7554 1726853192.52718: checking for any_errors_fatal
7554 1726853192.52726: done checking for any_errors_fatal
7554 1726853192.52726: checking for max_fail_percentage
7554 1726853192.52728: done checking for max_fail_percentage
7554 1726853192.52729: checking to see if all hosts have failed and the running result is not ok
7554 1726853192.52731: done checking to see if all hosts have failed
7554 1726853192.52731: getting the remaining hosts for this loop
7554 1726853192.52733: done getting the remaining hosts for this loop
7554 1726853192.52736: getting the next task for host managed_node3
7554 1726853192.52744: done getting next task for host managed_node3
7554 1726853192.52748: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
7554 1726853192.52751: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853192.52776: getting variables
7554 1726853192.52778: in VariableManager get_vars()
7554 1726853192.52829: Calling all_inventory to load vars for managed_node3
7554 1726853192.52831: Calling groups_inventory to load vars for managed_node3
7554 1726853192.52834: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853192.52845: Calling all_plugins_play to load vars for managed_node3
7554 1726853192.52849: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853192.52852: Calling groups_plugins_play to load vars for managed_node3
7554 1726853192.54805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853192.56705: done with get_vars()
7554 1726853192.56726: done getting variables
7554 1726853192.56790: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 13:26:32 -0400 (0:00:00.136) 0:00:46.535 ******
7554 1726853192.56823: entering _queue_task() for managed_node3/package
7554 1726853192.57194: worker is 1 (out of 1 available)
7554 1726853192.57208: exiting _queue_task() for managed_node3/package
7554 1726853192.57221: done queuing things up, now waiting for results queue to drain
7554 1726853192.57222: waiting for pending results...
7554 1726853192.57693: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
7554 1726853192.57703: in run() - task 02083763-bbaf-bdc3-98b6-000000000113
7554 1726853192.57716: variable 'ansible_search_path' from source: unknown
7554 1726853192.57724: variable 'ansible_search_path' from source: unknown
7554 1726853192.57770: calling self._execute()
7554 1726853192.57894: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853192.57905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853192.57929: variable 'omit' from source: magic vars
7554 1726853192.58328: variable 'ansible_distribution_major_version' from source: facts
7554 1726853192.58355: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853192.58559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7554 1726853192.58852: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7554 1726853192.58915: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7554 1726853192.59013: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7554 1726853192.59089: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7554 1726853192.59335: variable 'network_packages' from source: role '' defaults
7554 1726853192.59383: variable '__network_provider_setup' from source: role '' defaults
7554 1726853192.59401: variable '__network_service_name_default_nm' from source: role '' defaults
7554 1726853192.59563: variable '__network_service_name_default_nm' from source: role '' defaults
7554 1726853192.59566: variable '__network_packages_default_nm' from source: role '' defaults
7554 1726853192.59568: variable '__network_packages_default_nm' from source: role '' defaults
7554 1726853192.59759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7554 1726853192.63124: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7554 1726853192.63204: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7554 1726853192.63250: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7554 1726853192.63295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7554 1726853192.63334: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7554 1726853192.63435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.63476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.63536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.63564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.63585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.63633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.63753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.63756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.63759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.63761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.63983: variable '__network_packages_default_gobject_packages' from source: role '' defaults
7554 1726853192.64295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.64298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.64300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.64384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.64408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.64506: variable 'ansible_python' from source: facts
7554 1726853192.64654: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
7554 1726853192.65058: variable '__network_wpa_supplicant_required' from source: role '' defaults
7554 1726853192.65061: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
7554 1726853192.65248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.65359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.65394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.65436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.65468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.65539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853192.65582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853192.65614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.65657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853192.65677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853192.65837: variable 'network_connections' from source: task vars
7554 1726853192.65851: variable 'interface' from source: play vars
7554 1726853192.65959: variable 'interface' from source: play vars
7554 1726853192.66030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7554 1726853192.66068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7554 1726853192.66103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853192.66136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7554 1726853192.66193: variable '__network_wireless_connections_defined' from source: role '' defaults
7554 1726853192.66493: variable 'network_connections' from source: task vars
7554 1726853192.66503: variable 'interface' from source: play vars
7554 1726853192.66609: variable 'interface' from source: play vars
7554 1726853192.66649: variable '__network_packages_default_wireless' from source: role '' defaults
7554 1726853192.66731: variable '__network_wireless_connections_defined' from source: role '' defaults
7554 1726853192.67050: variable 'network_connections' from source: task vars
7554 1726853192.67061: variable 'interface' from source: play vars
7554 1726853192.67136: variable 'interface' from source: play vars
7554 1726853192.67167: variable '__network_packages_default_team' from source: role '' defaults
7554 1726853192.67256: variable '__network_team_connections_defined' from source: role '' defaults
7554 1726853192.67881: variable 'network_connections' from source: task vars
7554 1726853192.67884: variable 'interface' from source: play vars
7554 1726853192.67886: variable 'interface' from source: play vars
7554 1726853192.67938: variable '__network_service_name_default_initscripts' from source: role '' defaults
7554 1726853192.68012: variable '__network_service_name_default_initscripts' from source: role '' defaults
7554 1726853192.68024: variable '__network_packages_default_initscripts' from source: role '' defaults
7554 1726853192.68094: variable '__network_packages_default_initscripts' from source: role '' defaults
7554 1726853192.68324: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
7554 1726853192.68822: variable 'network_connections' from source: task vars
7554 1726853192.68832: variable 'interface' from source: play vars
7554 1726853192.68901: variable 'interface' from source: play vars
7554
1726853192.68914: variable 'ansible_distribution' from source: facts 7554 1726853192.68922: variable '__network_rh_distros' from source: role '' defaults 7554 1726853192.68932: variable 'ansible_distribution_major_version' from source: facts 7554 1726853192.68953: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7554 1726853192.69176: variable 'ansible_distribution' from source: facts 7554 1726853192.69190: variable '__network_rh_distros' from source: role '' defaults 7554 1726853192.69293: variable 'ansible_distribution_major_version' from source: facts 7554 1726853192.69296: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7554 1726853192.69467: variable 'ansible_distribution' from source: facts 7554 1726853192.69470: variable '__network_rh_distros' from source: role '' defaults 7554 1726853192.69475: variable 'ansible_distribution_major_version' from source: facts 7554 1726853192.69499: variable 'network_provider' from source: set_fact 7554 1726853192.69525: variable 'ansible_facts' from source: unknown 7554 1726853192.70299: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7554 1726853192.70308: when evaluation is False, skipping this task 7554 1726853192.70316: _execute() done 7554 1726853192.70379: dumping result to json 7554 1726853192.70386: done dumping result, returning 7554 1726853192.70389: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-bdc3-98b6-000000000113] 7554 1726853192.70391: sending task result for task 02083763-bbaf-bdc3-98b6-000000000113 7554 1726853192.70464: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000113 7554 1726853192.70467: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result 
was False" } 7554 1726853192.70538: no more pending results, returning what we have 7554 1726853192.70541: results queue empty 7554 1726853192.70542: checking for any_errors_fatal 7554 1726853192.70553: done checking for any_errors_fatal 7554 1726853192.70554: checking for max_fail_percentage 7554 1726853192.70556: done checking for max_fail_percentage 7554 1726853192.70557: checking to see if all hosts have failed and the running result is not ok 7554 1726853192.70558: done checking to see if all hosts have failed 7554 1726853192.70559: getting the remaining hosts for this loop 7554 1726853192.70560: done getting the remaining hosts for this loop 7554 1726853192.70564: getting the next task for host managed_node3 7554 1726853192.70570: done getting next task for host managed_node3 7554 1726853192.70576: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7554 1726853192.70578: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853192.70600: getting variables 7554 1726853192.70602: in VariableManager get_vars() 7554 1726853192.70654: Calling all_inventory to load vars for managed_node3 7554 1726853192.70656: Calling groups_inventory to load vars for managed_node3 7554 1726853192.70659: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853192.70668: Calling all_plugins_play to load vars for managed_node3 7554 1726853192.70875: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853192.70881: Calling groups_plugins_play to load vars for managed_node3 7554 1726853192.72336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853192.73883: done with get_vars() 7554 1726853192.73907: done getting variables 7554 1726853192.73966: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:26:32 -0400 (0:00:00.171) 0:00:46.707 ****** 7554 1726853192.74001: entering _queue_task() for managed_node3/package 7554 1726853192.74340: worker is 1 (out of 1 available) 7554 1726853192.74356: exiting _queue_task() for managed_node3/package 7554 1726853192.74368: done queuing things up, now waiting for results queue to drain 7554 1726853192.74370: waiting for pending results... 
7554 1726853192.74789: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7554 1726853192.74829: in run() - task 02083763-bbaf-bdc3-98b6-000000000114 7554 1726853192.74910: variable 'ansible_search_path' from source: unknown 7554 1726853192.74914: variable 'ansible_search_path' from source: unknown 7554 1726853192.74917: calling self._execute() 7554 1726853192.75024: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853192.75039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853192.75058: variable 'omit' from source: magic vars 7554 1726853192.75457: variable 'ansible_distribution_major_version' from source: facts 7554 1726853192.75478: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853192.75605: variable 'network_state' from source: role '' defaults 7554 1726853192.75672: Evaluated conditional (network_state != {}): False 7554 1726853192.75675: when evaluation is False, skipping this task 7554 1726853192.75678: _execute() done 7554 1726853192.75680: dumping result to json 7554 1726853192.75682: done dumping result, returning 7554 1726853192.75684: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-bdc3-98b6-000000000114] 7554 1726853192.75687: sending task result for task 02083763-bbaf-bdc3-98b6-000000000114 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853192.75817: no more pending results, returning what we have 7554 1726853192.75822: results queue empty 7554 1726853192.75822: checking for any_errors_fatal 7554 1726853192.75830: done checking for any_errors_fatal 7554 1726853192.75831: checking for max_fail_percentage 7554 1726853192.75833: done 
checking for max_fail_percentage 7554 1726853192.75834: checking to see if all hosts have failed and the running result is not ok 7554 1726853192.75835: done checking to see if all hosts have failed 7554 1726853192.75835: getting the remaining hosts for this loop 7554 1726853192.75837: done getting the remaining hosts for this loop 7554 1726853192.75841: getting the next task for host managed_node3 7554 1726853192.75849: done getting next task for host managed_node3 7554 1726853192.75853: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7554 1726853192.75856: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853192.75884: getting variables 7554 1726853192.75886: in VariableManager get_vars() 7554 1726853192.75934: Calling all_inventory to load vars for managed_node3 7554 1726853192.75937: Calling groups_inventory to load vars for managed_node3 7554 1726853192.75940: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853192.75954: Calling all_plugins_play to load vars for managed_node3 7554 1726853192.75957: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853192.75960: Calling groups_plugins_play to load vars for managed_node3 7554 1726853192.76609: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000114 7554 1726853192.76613: WORKER PROCESS EXITING 7554 1726853192.77765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853192.79300: done with get_vars() 7554 1726853192.79322: done getting variables 7554 1726853192.79388: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:26:32 -0400 (0:00:00.054) 0:00:46.761 ****** 7554 1726853192.79423: entering _queue_task() for managed_node3/package 7554 1726853192.79751: worker is 1 (out of 1 available) 7554 1726853192.79763: exiting _queue_task() for managed_node3/package 7554 1726853192.79976: done queuing things up, now waiting for results queue to drain 7554 1726853192.79978: waiting for pending results... 
7554 1726853192.80059: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7554 1726853192.80278: in run() - task 02083763-bbaf-bdc3-98b6-000000000115 7554 1726853192.80281: variable 'ansible_search_path' from source: unknown 7554 1726853192.80283: variable 'ansible_search_path' from source: unknown 7554 1726853192.80285: calling self._execute() 7554 1726853192.80372: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853192.80384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853192.80398: variable 'omit' from source: magic vars 7554 1726853192.80795: variable 'ansible_distribution_major_version' from source: facts 7554 1726853192.80812: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853192.80935: variable 'network_state' from source: role '' defaults 7554 1726853192.80953: Evaluated conditional (network_state != {}): False 7554 1726853192.80967: when evaluation is False, skipping this task 7554 1726853192.80978: _execute() done 7554 1726853192.81074: dumping result to json 7554 1726853192.81077: done dumping result, returning 7554 1726853192.81080: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-bdc3-98b6-000000000115] 7554 1726853192.81083: sending task result for task 02083763-bbaf-bdc3-98b6-000000000115 7554 1726853192.81154: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000115 7554 1726853192.81157: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7554 1726853192.81219: no more pending results, returning what we have 7554 1726853192.81223: results queue empty 7554 1726853192.81224: checking for any_errors_fatal 7554 
1726853192.81234: done checking for any_errors_fatal 7554 1726853192.81235: checking for max_fail_percentage 7554 1726853192.81237: done checking for max_fail_percentage 7554 1726853192.81238: checking to see if all hosts have failed and the running result is not ok 7554 1726853192.81239: done checking to see if all hosts have failed 7554 1726853192.81240: getting the remaining hosts for this loop 7554 1726853192.81242: done getting the remaining hosts for this loop 7554 1726853192.81248: getting the next task for host managed_node3 7554 1726853192.81256: done getting next task for host managed_node3 7554 1726853192.81260: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7554 1726853192.81263: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853192.81290: getting variables 7554 1726853192.81292: in VariableManager get_vars() 7554 1726853192.81340: Calling all_inventory to load vars for managed_node3 7554 1726853192.81346: Calling groups_inventory to load vars for managed_node3 7554 1726853192.81349: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853192.81361: Calling all_plugins_play to load vars for managed_node3 7554 1726853192.81365: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853192.81368: Calling groups_plugins_play to load vars for managed_node3 7554 1726853192.82874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853192.84388: done with get_vars() 7554 1726853192.84409: done getting variables 7554 1726853192.84463: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:26:32 -0400 (0:00:00.050) 0:00:46.812 ****** 7554 1726853192.84497: entering _queue_task() for managed_node3/service 7554 1726853192.84896: worker is 1 (out of 1 available) 7554 1726853192.84908: exiting _queue_task() for managed_node3/service 7554 1726853192.84918: done queuing things up, now waiting for results queue to drain 7554 1726853192.84920: waiting for pending results... 
7554 1726853192.85161: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7554 1726853192.85277: in run() - task 02083763-bbaf-bdc3-98b6-000000000116 7554 1726853192.85476: variable 'ansible_search_path' from source: unknown 7554 1726853192.85480: variable 'ansible_search_path' from source: unknown 7554 1726853192.85482: calling self._execute() 7554 1726853192.85485: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853192.85487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853192.85489: variable 'omit' from source: magic vars 7554 1726853192.85848: variable 'ansible_distribution_major_version' from source: facts 7554 1726853192.85867: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853192.85994: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853192.86195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853192.88781: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853192.88852: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853192.88911: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853192.88953: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853192.88988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853192.89068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 
1726853192.89111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853192.89145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853192.89196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853192.89217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853192.89269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853192.89303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853192.89334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853192.89381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853192.89411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7554 1726853192.89475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853192.89482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853192.89511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853192.89560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853192.89629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853192.89765: variable 'network_connections' from source: task vars 7554 1726853192.89787: variable 'interface' from source: play vars 7554 1726853192.89866: variable 'interface' from source: play vars 7554 1726853192.89948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853192.90126: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853192.90178: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853192.90216: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853192.90287: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853192.90313: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853192.90340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853192.90380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853192.90476: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853192.90481: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853192.90736: variable 'network_connections' from source: task vars 7554 1726853192.90748: variable 'interface' from source: play vars 7554 1726853192.90810: variable 'interface' from source: play vars 7554 1726853192.90840: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7554 1726853192.90851: when evaluation is False, skipping this task 7554 1726853192.90857: _execute() done 7554 1726853192.90862: dumping result to json 7554 1726853192.90867: done dumping result, returning 7554 1726853192.90879: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-bdc3-98b6-000000000116] 7554 1726853192.90888: sending task result for task 02083763-bbaf-bdc3-98b6-000000000116 7554 1726853192.91127: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000116 7554 1726853192.91137: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7554 1726853192.91191: no more pending results, returning what we have 7554 1726853192.91194: results queue empty 7554 1726853192.91195: checking for any_errors_fatal 7554 1726853192.91201: done checking for any_errors_fatal 7554 1726853192.91201: checking for max_fail_percentage 7554 1726853192.91203: done checking for max_fail_percentage 7554 1726853192.91204: checking to see if all hosts have failed and the running result is not ok 7554 1726853192.91206: done checking to see if all hosts have failed 7554 1726853192.91206: getting the remaining hosts for this loop 7554 1726853192.91208: done getting the remaining hosts for this loop 7554 1726853192.91212: getting the next task for host managed_node3 7554 1726853192.91218: done getting next task for host managed_node3 7554 1726853192.91222: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7554 1726853192.91225: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853192.91248: getting variables 7554 1726853192.91249: in VariableManager get_vars() 7554 1726853192.91296: Calling all_inventory to load vars for managed_node3 7554 1726853192.91298: Calling groups_inventory to load vars for managed_node3 7554 1726853192.91301: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853192.91310: Calling all_plugins_play to load vars for managed_node3 7554 1726853192.91313: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853192.91315: Calling groups_plugins_play to load vars for managed_node3 7554 1726853192.93039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853192.94546: done with get_vars() 7554 1726853192.94568: done getting variables 7554 1726853192.94629: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:26:32 -0400 (0:00:00.101) 0:00:46.914 ****** 7554 1726853192.94664: entering _queue_task() for managed_node3/service 7554 1726853192.94995: worker is 1 (out of 1 available) 7554 1726853192.95008: exiting _queue_task() for managed_node3/service 7554 1726853192.95020: done queuing things up, now waiting for results queue to drain 7554 1726853192.95022: waiting for pending results... 
7554 1726853192.95317: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7554 1726853192.95481: in run() - task 02083763-bbaf-bdc3-98b6-000000000117 7554 1726853192.95507: variable 'ansible_search_path' from source: unknown 7554 1726853192.95516: variable 'ansible_search_path' from source: unknown 7554 1726853192.95558: calling self._execute() 7554 1726853192.95712: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853192.95716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853192.95720: variable 'omit' from source: magic vars 7554 1726853192.96090: variable 'ansible_distribution_major_version' from source: facts 7554 1726853192.96108: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853192.96290: variable 'network_provider' from source: set_fact 7554 1726853192.96301: variable 'network_state' from source: role '' defaults 7554 1726853192.96476: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7554 1726853192.96479: variable 'omit' from source: magic vars 7554 1726853192.96481: variable 'omit' from source: magic vars 7554 1726853192.96484: variable 'network_service_name' from source: role '' defaults 7554 1726853192.96486: variable 'network_service_name' from source: role '' defaults 7554 1726853192.96591: variable '__network_provider_setup' from source: role '' defaults 7554 1726853192.96607: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853192.96675: variable '__network_service_name_default_nm' from source: role '' defaults 7554 1726853192.96690: variable '__network_packages_default_nm' from source: role '' defaults 7554 1726853192.96761: variable '__network_packages_default_nm' from source: role '' defaults 7554 1726853192.96997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 
1726853192.99234: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853192.99308: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853192.99350: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853192.99389: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853192.99422: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853192.99504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853192.99541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853192.99569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853192.99614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853192.99633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853192.99754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7554 1726853192.99757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853192.99760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853192.99783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853192.99803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853193.00036: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7554 1726853193.00152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853193.00182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853193.00208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853193.00244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853193.00259: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853193.00351: variable 'ansible_python' from source: facts 7554 1726853193.00379: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7554 1726853193.00457: variable '__network_wpa_supplicant_required' from source: role '' defaults 7554 1726853193.00569: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7554 1726853193.00876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853193.00879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853193.00891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853193.00931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853193.00991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853193.01200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853193.01237: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853193.01273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853193.01318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853193.01336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853193.01480: variable 'network_connections' from source: task vars 7554 1726853193.01493: variable 'interface' from source: play vars 7554 1726853193.01567: variable 'interface' from source: play vars 7554 1726853193.01673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853193.01883: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853193.02080: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853193.02083: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853193.02085: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853193.02088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7554 1726853193.02113: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7554 1726853193.02148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853193.02193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7554 1726853193.02247: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853193.02586: variable 'network_connections' from source: task vars 7554 1726853193.02597: variable 'interface' from source: play vars 7554 1726853193.02681: variable 'interface' from source: play vars 7554 1726853193.02718: variable '__network_packages_default_wireless' from source: role '' defaults 7554 1726853193.02809: variable '__network_wireless_connections_defined' from source: role '' defaults 7554 1726853193.03106: variable 'network_connections' from source: task vars 7554 1726853193.03116: variable 'interface' from source: play vars 7554 1726853193.03191: variable 'interface' from source: play vars 7554 1726853193.03217: variable '__network_packages_default_team' from source: role '' defaults 7554 1726853193.03301: variable '__network_team_connections_defined' from source: role '' defaults 7554 1726853193.03590: variable 'network_connections' from source: task vars 7554 1726853193.03603: variable 'interface' from source: play vars 7554 1726853193.03673: variable 'interface' from source: play vars 7554 1726853193.03732: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853193.03795: variable '__network_service_name_default_initscripts' from source: role '' defaults 7554 1726853193.03806: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7554 1726853193.03869: variable '__network_packages_default_initscripts' from source: role '' defaults 7554 1726853193.04091: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7554 1726853193.05004: variable 'network_connections' from source: task vars 7554 1726853193.05077: variable 'interface' from source: play vars 7554 1726853193.05081: variable 'interface' from source: play vars 7554 1726853193.05123: variable 'ansible_distribution' from source: facts 7554 1726853193.05333: variable '__network_rh_distros' from source: role '' defaults 7554 1726853193.05337: variable 'ansible_distribution_major_version' from source: facts 7554 1726853193.05339: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7554 1726853193.05575: variable 'ansible_distribution' from source: facts 7554 1726853193.05584: variable '__network_rh_distros' from source: role '' defaults 7554 1726853193.05593: variable 'ansible_distribution_major_version' from source: facts 7554 1726853193.05608: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7554 1726853193.05860: variable 'ansible_distribution' from source: facts 7554 1726853193.05869: variable '__network_rh_distros' from source: role '' defaults 7554 1726853193.05885: variable 'ansible_distribution_major_version' from source: facts 7554 1726853193.05925: variable 'network_provider' from source: set_fact 7554 1726853193.05954: variable 'omit' from source: magic vars 7554 1726853193.05994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853193.06025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853193.06095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853193.06099: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853193.06101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853193.06119: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853193.06127: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853193.06134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853193.06247: Set connection var ansible_shell_executable to /bin/sh 7554 1726853193.06262: Set connection var ansible_pipelining to False 7554 1726853193.06270: Set connection var ansible_shell_type to sh 7554 1726853193.06279: Set connection var ansible_connection to ssh 7554 1726853193.06292: Set connection var ansible_timeout to 10 7554 1726853193.06312: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853193.06332: variable 'ansible_shell_executable' from source: unknown 7554 1726853193.06339: variable 'ansible_connection' from source: unknown 7554 1726853193.06376: variable 'ansible_module_compression' from source: unknown 7554 1726853193.06379: variable 'ansible_shell_type' from source: unknown 7554 1726853193.06381: variable 'ansible_shell_executable' from source: unknown 7554 1726853193.06383: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853193.06385: variable 'ansible_pipelining' from source: unknown 7554 1726853193.06387: variable 'ansible_timeout' from source: unknown 7554 1726853193.06389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853193.06494: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853193.06511: variable 'omit' from source: magic vars 7554 1726853193.06576: starting attempt loop 7554 1726853193.06580: running the handler 7554 1726853193.06621: variable 'ansible_facts' from source: unknown 7554 1726853193.07407: _low_level_execute_command(): starting 7554 1726853193.07422: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853193.08172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853193.08225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853193.08242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853193.08274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853193.08478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853193.10118: stdout chunk 
(state=3): >>>/root <<< 7554 1726853193.10243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853193.10252: stdout chunk (state=3): >>><<< 7554 1726853193.10262: stderr chunk (state=3): >>><<< 7554 1726853193.10410: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853193.10414: _low_level_execute_command(): starting 7554 1726853193.10417: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950 `" && echo ansible-tmp-1726853193.103335-9273-206896872236950="` echo /root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950 `" ) && sleep 0' 7554 1726853193.11589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 <<< 7554 1726853193.11633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853193.11785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853193.11833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853193.12036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853193.14028: stdout chunk (state=3): >>>ansible-tmp-1726853193.103335-9273-206896872236950=/root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950 <<< 7554 1726853193.14134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853193.14165: stderr chunk (state=3): >>><<< 7554 1726853193.14175: stdout chunk (state=3): >>><<< 7554 1726853193.14198: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853193.103335-9273-206896872236950=/root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853193.14402: variable 'ansible_module_compression' from source: unknown 7554 1726853193.14405: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7554 1726853193.14463: variable 'ansible_facts' from source: unknown 7554 1726853193.14894: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950/AnsiballZ_systemd.py 7554 1726853193.15294: Sending initial data 7554 1726853193.15297: Sent initial data (153 bytes) 7554 1726853193.16586: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853193.16746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853193.16835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853193.18807: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853193.18866: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853193.18931: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmput3fson5 /root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950/AnsiballZ_systemd.py <<< 7554 1726853193.18935: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950/AnsiballZ_systemd.py" <<< 7554 1726853193.19082: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmput3fson5" to remote "/root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950/AnsiballZ_systemd.py" <<< 7554 1726853193.21816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853193.21869: stderr chunk (state=3): >>><<< 7554 1726853193.21875: stdout chunk (state=3): >>><<< 7554 1726853193.21892: done transferring module to remote 7554 1726853193.21903: _low_level_execute_command(): starting 7554 1726853193.21907: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950/ /root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950/AnsiballZ_systemd.py && sleep 0' 7554 1726853193.22995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853193.23278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853193.23588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853193.23712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853193.25532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853193.25579: stderr chunk (state=3): >>><<< 7554 1726853193.25587: stdout chunk (state=3): >>><<< 7554 1726853193.25609: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853193.25612: _low_level_execute_command(): starting 7554 1726853193.25663: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950/AnsiballZ_systemd.py && sleep 0' 7554 1726853193.26660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853193.26716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853193.26733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853193.26739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853193.26744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853193.26753: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853193.26759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853193.26877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853193.26965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853193.26969: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853193.27039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853193.59984: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; 
status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9547776", "MemoryPeak": "10080256", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3329708032", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "211184000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", 
"StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", <<< 7554 1726853193.59999: stdout chunk (state=3): >>>"MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", 
"SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", 
"KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service multi-user.target cl<<< 7554 1726853193.60003: stdout chunk (state=3): >>>oud-init.service network.target", "After": "cloud-init-local.service network-pre.target dbus.socket system.slice sysinit.target dbus-broker.service basic.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:21:20 EDT", "StateChangeTimestampMonotonic": "24855925", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": 
"yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7554 1726853193.62141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853193.62167: stderr chunk (state=3): >>><<< 7554 1726853193.62172: stdout chunk (state=3): >>><<< 7554 1726853193.62191: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not 
set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 13:21:20 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9547776", "MemoryPeak": "10080256", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3329708032", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "211184000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not 
set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service multi-user.target cloud-init.service network.target", "After": "cloud-init-local.service network-pre.target dbus.socket system.slice sysinit.target dbus-broker.service basic.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:21:20 EDT", "StateChangeTimestampMonotonic": "24855925", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", 
"InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853193.62881: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853193.62885: _low_level_execute_command(): starting 7554 1726853193.62888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853193.103335-9273-206896872236950/ > /dev/null 2>&1 && sleep 0' 7554 1726853193.63875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
7554 1726853193.63893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853193.63968: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853193.64005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853193.64017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853193.64091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853193.66061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853193.66064: stdout chunk (state=3): >>><<< 7554 1726853193.66066: stderr chunk (state=3): >>><<< 7554 1726853193.66277: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853193.66280: handler run complete 7554 1726853193.66283: attempt loop complete, returning result 7554 1726853193.66285: _execute() done 7554 1726853193.66286: dumping result to json 7554 1726853193.66288: done dumping result, returning 7554 1726853193.66290: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-bdc3-98b6-000000000117] 7554 1726853193.66292: sending task result for task 02083763-bbaf-bdc3-98b6-000000000117 7554 1726853193.67077: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000117 7554 1726853193.67081: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7554 1726853193.67161: no more pending results, returning what we have 7554 1726853193.67166: results queue empty 7554 1726853193.67167: checking for any_errors_fatal 7554 1726853193.67174: done checking for any_errors_fatal 7554 1726853193.67174: checking for max_fail_percentage 7554 1726853193.67176: done checking for max_fail_percentage 7554 1726853193.67177: checking to see if all hosts have failed and the running result is not ok 7554 1726853193.67178: done checking to see if all hosts have failed 7554 1726853193.67179: getting the remaining 
hosts for this loop 7554 1726853193.67180: done getting the remaining hosts for this loop 7554 1726853193.67184: getting the next task for host managed_node3 7554 1726853193.67190: done getting next task for host managed_node3 7554 1726853193.67194: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7554 1726853193.67198: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853193.67215: getting variables 7554 1726853193.67217: in VariableManager get_vars() 7554 1726853193.67263: Calling all_inventory to load vars for managed_node3 7554 1726853193.67265: Calling groups_inventory to load vars for managed_node3 7554 1726853193.67268: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853193.67315: Calling all_plugins_play to load vars for managed_node3 7554 1726853193.67323: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853193.67476: Calling groups_plugins_play to load vars for managed_node3 7554 1726853193.68757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853193.70648: done with get_vars() 7554 1726853193.70670: done getting variables 7554 1726853193.70717: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:26:33 -0400 (0:00:00.760) 0:00:47.674 ****** 7554 1726853193.70741: entering _queue_task() for managed_node3/service 7554 1726853193.71013: worker is 1 (out of 1 available) 7554 1726853193.71059: exiting _queue_task() for managed_node3/service 7554 1726853193.71070: done queuing things up, now waiting for results queue to drain 7554 1726853193.71073: waiting for pending results... 7554 1726853193.71311: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7554 1726853193.71577: in run() - task 02083763-bbaf-bdc3-98b6-000000000118 7554 1726853193.71581: variable 'ansible_search_path' from source: unknown 7554 1726853193.71584: variable 'ansible_search_path' from source: unknown 7554 1726853193.71587: calling self._execute() 7554 1726853193.71604: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853193.71615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853193.71628: variable 'omit' from source: magic vars 7554 1726853193.72012: variable 'ansible_distribution_major_version' from source: facts 7554 1726853193.72031: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853193.72157: variable 'network_provider' from source: set_fact 7554 1726853193.72168: Evaluated conditional (network_provider == "nm"): True 7554 1726853193.72267: variable '__network_wpa_supplicant_required' from source: role '' defaults 7554 1726853193.72365: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7554 
1726853193.72536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853193.74566: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853193.74610: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853193.74640: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853193.74666: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853193.74688: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853193.74756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853193.74780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853193.74799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853193.74824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853193.74835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853193.74876: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853193.74893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853193.74909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853193.74933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853193.74948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853193.74979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853193.74995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853193.75010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853193.75035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853193.75047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853193.75149: variable 'network_connections' from source: task vars
7554 1726853193.75159: variable 'interface' from source: play vars
7554 1726853193.75211: variable 'interface' from source: play vars
7554 1726853193.75262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7554 1726853193.75376: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7554 1726853193.75405: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7554 1726853193.75427: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7554 1726853193.75449: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7554 1726853193.75479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7554 1726853193.75495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7554 1726853193.75516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853193.75533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7554 1726853193.75574: variable '__network_wireless_connections_defined' from source: role '' defaults
7554 1726853193.75727: variable 'network_connections' from source: task vars
7554 1726853193.75731: variable 'interface' from source: play vars
7554 1726853193.75776: variable 'interface' from source: play vars
7554 1726853193.75798: Evaluated conditional (__network_wpa_supplicant_required): False
7554 1726853193.75801: when evaluation is False, skipping this task
7554 1726853193.75804: _execute() done
7554 1726853193.75806: dumping result to json
7554 1726853193.75809: done dumping result, returning
7554 1726853193.75816: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-bdc3-98b6-000000000118]
7554 1726853193.75829: sending task result for task 02083763-bbaf-bdc3-98b6-000000000118
7554 1726853193.75913: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000118
7554 1726853193.75916: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
7554 1726853193.75992: no more pending results, returning what we have
7554 1726853193.75995: results queue empty
7554 1726853193.75996: checking for any_errors_fatal
7554 1726853193.76027: done checking for any_errors_fatal
7554 1726853193.76028: checking for max_fail_percentage
7554 1726853193.76029: done checking for max_fail_percentage
7554 1726853193.76030: checking to see if all hosts have failed and the running result is not ok
7554 1726853193.76031: done checking to see if all hosts have failed
7554 1726853193.76032: getting the remaining hosts for this loop
7554 1726853193.76033: done getting the remaining hosts for this loop
7554 1726853193.76037: getting the next task for host managed_node3
7554 1726853193.76045: done getting next task for host managed_node3
7554 1726853193.76050: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
7554 1726853193.76053: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853193.76073: getting variables
7554 1726853193.76075: in VariableManager get_vars()
7554 1726853193.76136: Calling all_inventory to load vars for managed_node3
7554 1726853193.76138: Calling groups_inventory to load vars for managed_node3
7554 1726853193.76141: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853193.76152: Calling all_plugins_play to load vars for managed_node3
7554 1726853193.76155: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853193.76157: Calling groups_plugins_play to load vars for managed_node3
7554 1726853193.77410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853193.78288: done with get_vars()
7554 1726853193.78305: done getting variables
7554 1726853193.78352: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 13:26:33 -0400 (0:00:00.076) 0:00:47.751 ******
7554 1726853193.78378: entering _queue_task() for managed_node3/service
7554 1726853193.78626: worker is 1 (out of 1 available)
7554 1726853193.78639: exiting _queue_task() for managed_node3/service
7554 1726853193.78654: done queuing things up, now waiting for results queue to drain
7554 1726853193.78656: waiting for pending results...
7554 1726853193.78836: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service
7554 1726853193.78932: in run() - task 02083763-bbaf-bdc3-98b6-000000000119
7554 1726853193.78945: variable 'ansible_search_path' from source: unknown
7554 1726853193.78950: variable 'ansible_search_path' from source: unknown
7554 1726853193.78977: calling self._execute()
7554 1726853193.79062: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853193.79066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853193.79076: variable 'omit' from source: magic vars
7554 1726853193.79357: variable 'ansible_distribution_major_version' from source: facts
7554 1726853193.79367: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853193.79531: variable 'network_provider' from source: set_fact
7554 1726853193.79534: Evaluated conditional (network_provider == "initscripts"): False
7554 1726853193.79536: when evaluation is False, skipping this task
7554 1726853193.79539: _execute() done
7554 1726853193.79541: dumping result to json
7554 1726853193.79546: done dumping result, returning
7554 1726853193.79548: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-bdc3-98b6-000000000119]
7554 1726853193.79549: sending task result for task 02083763-bbaf-bdc3-98b6-000000000119
7554 1726853193.79617: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000119
7554 1726853193.79620: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
7554 1726853193.79672: no more pending results, returning what we have
7554 1726853193.79675: results queue empty
7554 1726853193.79676: checking for any_errors_fatal
7554 1726853193.79688: done checking for any_errors_fatal
7554 1726853193.79689: checking for max_fail_percentage
7554 1726853193.79691: done checking for max_fail_percentage
7554 1726853193.79691: checking to see if all hosts have failed and the running result is not ok
7554 1726853193.79692: done checking to see if all hosts have failed
7554 1726853193.79693: getting the remaining hosts for this loop
7554 1726853193.79695: done getting the remaining hosts for this loop
7554 1726853193.79698: getting the next task for host managed_node3
7554 1726853193.79703: done getting next task for host managed_node3
7554 1726853193.79707: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
7554 1726853193.79710: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853193.79727: getting variables
7554 1726853193.79728: in VariableManager get_vars()
7554 1726853193.79766: Calling all_inventory to load vars for managed_node3
7554 1726853193.79767: Calling groups_inventory to load vars for managed_node3
7554 1726853193.79769: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853193.79777: Calling all_plugins_play to load vars for managed_node3
7554 1726853193.79779: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853193.79781: Calling groups_plugins_play to load vars for managed_node3
7554 1726853193.80663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853193.81522: done with get_vars()
7554 1726853193.81544: done getting variables
7554 1726853193.81589: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 13:26:33 -0400 (0:00:00.032) 0:00:47.783 ******
7554 1726853193.81612: entering _queue_task() for managed_node3/copy
7554 1726853193.81865: worker is 1 (out of 1 available)
7554 1726853193.81879: exiting _queue_task() for managed_node3/copy
7554 1726853193.81892: done queuing things up, now waiting for results queue to drain
7554 1726853193.81894: waiting for pending results...
7554 1726853193.82079: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
7554 1726853193.82175: in run() - task 02083763-bbaf-bdc3-98b6-00000000011a
7554 1726853193.82190: variable 'ansible_search_path' from source: unknown
7554 1726853193.82193: variable 'ansible_search_path' from source: unknown
7554 1726853193.82224: calling self._execute()
7554 1726853193.82303: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853193.82308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853193.82317: variable 'omit' from source: magic vars
7554 1726853193.82608: variable 'ansible_distribution_major_version' from source: facts
7554 1726853193.82618: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853193.82699: variable 'network_provider' from source: set_fact
7554 1726853193.82703: Evaluated conditional (network_provider == "initscripts"): False
7554 1726853193.82706: when evaluation is False, skipping this task
7554 1726853193.82710: _execute() done
7554 1726853193.82713: dumping result to json
7554 1726853193.82715: done dumping result, returning
7554 1726853193.82725: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-bdc3-98b6-00000000011a]
7554 1726853193.82730: sending task result for task 02083763-bbaf-bdc3-98b6-00000000011a
7554 1726853193.82820: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000011a
7554 1726853193.82822: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
7554 1726853193.82873: no more pending results, returning what we have
7554 1726853193.82876: results queue empty
7554 1726853193.82877: checking for any_errors_fatal
7554 1726853193.82884: done checking for any_errors_fatal
7554 1726853193.82884: checking for max_fail_percentage
7554 1726853193.82886: done checking for max_fail_percentage
7554 1726853193.82887: checking to see if all hosts have failed and the running result is not ok
7554 1726853193.82888: done checking to see if all hosts have failed
7554 1726853193.82888: getting the remaining hosts for this loop
7554 1726853193.82890: done getting the remaining hosts for this loop
7554 1726853193.82893: getting the next task for host managed_node3
7554 1726853193.82900: done getting next task for host managed_node3
7554 1726853193.82904: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
7554 1726853193.82907: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853193.82928: getting variables
7554 1726853193.82930: in VariableManager get_vars()
7554 1726853193.82981: Calling all_inventory to load vars for managed_node3
7554 1726853193.82984: Calling groups_inventory to load vars for managed_node3
7554 1726853193.82986: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853193.82996: Calling all_plugins_play to load vars for managed_node3
7554 1726853193.82999: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853193.83001: Calling groups_plugins_play to load vars for managed_node3
7554 1726853193.83791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853193.84659: done with get_vars()
7554 1726853193.84678: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 13:26:33 -0400 (0:00:00.031) 0:00:47.814 ******
7554 1726853193.84741: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
7554 1726853193.84997: worker is 1 (out of 1 available)
7554 1726853193.85010: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
7554 1726853193.85023: done queuing things up, now waiting for results queue to drain
7554 1726853193.85025: waiting for pending results...
7554 1726853193.85212: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
7554 1726853193.85308: in run() - task 02083763-bbaf-bdc3-98b6-00000000011b
7554 1726853193.85321: variable 'ansible_search_path' from source: unknown
7554 1726853193.85324: variable 'ansible_search_path' from source: unknown
7554 1726853193.85361: calling self._execute()
7554 1726853193.85436: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853193.85440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853193.85451: variable 'omit' from source: magic vars
7554 1726853193.85733: variable 'ansible_distribution_major_version' from source: facts
7554 1726853193.85747: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853193.85750: variable 'omit' from source: magic vars
7554 1726853193.85791: variable 'omit' from source: magic vars
7554 1726853193.85914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7554 1726853193.87597: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7554 1726853193.87637: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7554 1726853193.87668: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7554 1726853193.87693: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7554 1726853193.87713: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7554 1726853193.87772: variable 'network_provider' from source: set_fact
7554 1726853193.87866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7554 1726853193.87889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7554 1726853193.87906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7554 1726853193.87931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7554 1726853193.87941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7554 1726853193.87996: variable 'omit' from source: magic vars
7554 1726853193.88074: variable 'omit' from source: magic vars
7554 1726853193.88146: variable 'network_connections' from source: task vars
7554 1726853193.88153: variable 'interface' from source: play vars
7554 1726853193.88199: variable 'interface' from source: play vars
7554 1726853193.88301: variable 'omit' from source: magic vars
7554 1726853193.88310: variable '__lsr_ansible_managed' from source: task vars
7554 1726853193.88350: variable '__lsr_ansible_managed' from source: task vars
7554 1726853193.88537: Loaded config def from plugin (lookup/template)
7554 1726853193.88541: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
7554 1726853193.88564: File lookup term: get_ansible_managed.j2
7554 1726853193.88567: variable 'ansible_search_path' from source: unknown
7554 1726853193.88570: evaluation_path:
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
7554 1726853193.88583: search_path:
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
    /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
7554 1726853193.88596: variable 'ansible_search_path' from source: unknown
7554 1726853193.91808: variable 'ansible_managed' from source: unknown
7554 1726853193.91885: variable 'omit' from source: magic vars
7554 1726853193.91907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7554 1726853193.91927: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7554 1726853193.91945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7554 1726853193.91956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7554 1726853193.91965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7554 1726853193.91987: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7554 1726853193.91990: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853193.91992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853193.92056: Set connection var ansible_shell_executable to /bin/sh
7554 1726853193.92063: Set connection var ansible_pipelining to False
7554 1726853193.92066: Set connection var ansible_shell_type to sh
7554 1726853193.92068: Set connection var ansible_connection to ssh
7554 1726853193.92077: Set connection var ansible_timeout to 10
7554 1726853193.92082: Set connection var ansible_module_compression to ZIP_DEFLATED
7554 1726853193.92098: variable 'ansible_shell_executable' from source: unknown
7554 1726853193.92101: variable 'ansible_connection' from source: unknown
7554 1726853193.92105: variable 'ansible_module_compression' from source: unknown
7554 1726853193.92107: variable 'ansible_shell_type' from source: unknown
7554 1726853193.92110: variable 'ansible_shell_executable' from source: unknown
7554 1726853193.92112: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853193.92114: variable 'ansible_pipelining' from source: unknown
7554 1726853193.92116: variable 'ansible_timeout' from source: unknown
7554 1726853193.92125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853193.92212: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
7554 1726853193.92232: variable 'omit' from source: magic vars
7554 1726853193.92238: starting attempt loop
7554 1726853193.92241: running the handler
7554 1726853193.92246: _low_level_execute_command(): starting
7554 1726853193.92248: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7554 1726853193.92733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7554 1726853193.92748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<<
7554 1726853193.92751: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853193.92767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7554 1726853193.92772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853193.92814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
7554 1726853193.92828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
7554 1726853193.92906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7554 1726853193.94614: stdout chunk (state=3): >>>/root <<<
7554 1726853193.94712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7554 1726853193.94741: stderr chunk (state=3): >>><<<
7554 1726853193.94747: stdout chunk (state=3): >>><<<
7554 1726853193.94764: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7554 1726853193.94777: _low_level_execute_command(): starting
7554 1726853193.94783: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848 `" && echo ansible-tmp-1726853193.9476526-9312-108010681482848="` echo /root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848 `" ) && sleep 0'
7554 1726853193.95232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
7554 1726853193.95235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<<
7554 1726853193.95237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853193.95239: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7554 1726853193.95242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853193.95298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
7554 1726853193.95301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7554 1726853193.95305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7554 1726853193.95366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7554 1726853193.97327: stdout chunk (state=3): >>>ansible-tmp-1726853193.9476526-9312-108010681482848=/root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848 <<<
7554 1726853193.97440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7554 1726853193.97468: stderr chunk (state=3): >>><<<
7554 1726853193.97473: stdout chunk (state=3): >>><<<
7554 1726853193.97490: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853193.9476526-9312-108010681482848=/root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7554 1726853193.97532: variable 'ansible_module_compression' from source: unknown
7554 1726853193.97570: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED
7554 1726853193.97614: variable 'ansible_facts' from source: unknown
7554 1726853193.97710: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848/AnsiballZ_network_connections.py
7554 1726853193.97813: Sending initial data
7554 1726853193.97816: Sent initial data (166 bytes)
7554 1726853193.98248: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
7554 1726853193.98251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7554 1726853193.98282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853193.98285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7554 1726853193.98287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853193.98349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
7554 1726853193.98352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7554 1726853193.98354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7554 1726853193.98415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7554 1726853194.00050: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<<
7554 1726853194.00053: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
7554 1726853194.00108: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
7554 1726853194.00169: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmppd9oa8e3 /root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848/AnsiballZ_network_connections.py <<<
7554 1726853194.00174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848/AnsiballZ_network_connections.py" <<<
7554 1726853194.00230: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmppd9oa8e3" to remote "/root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848/AnsiballZ_network_connections.py" <<<
7554 1726853194.00234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848/AnsiballZ_network_connections.py" <<<
7554 1726853194.01064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7554 1726853194.01108: stderr chunk (state=3): >>><<<
7554 1726853194.01111: stdout chunk (state=3): >>><<<
7554 1726853194.01134: done transferring module to remote
7554 1726853194.01143: _low_level_execute_command(): starting
7554 1726853194.01150: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848/ /root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848/AnsiballZ_network_connections.py && sleep 0'
7554 1726853194.01604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
7554 1726853194.01607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7554 1726853194.01609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853194.01611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<<
7554 1726853194.01613: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7554 1726853194.01615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7554 1726853194.01674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
7554 1726853194.01677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
7554 1726853194.01731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7554 1726853194.03600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7554 1726853194.03627: stderr chunk (state=3): >>><<<
7554 1726853194.03631: stdout chunk (state=3): >>><<<
7554 1726853194.03646: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1:
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853194.03650: _low_level_execute_command(): starting 7554 1726853194.03652: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848/AnsiballZ_network_connections.py && sleep 0' 7554 1726853194.04070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853194.04095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853194.04098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853194.04101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853194.04106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853194.04155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853194.04160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853194.04164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853194.04228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853194.36453: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17h1titc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17h1titc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/bae577e5-e110-4880-a74b-012e4e387e44: error=unknown <<< 7554 1726853194.36634: stdout chunk (state=3): >>> <<< 7554 1726853194.36647: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": 
""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7554 1726853194.38532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853194.38546: stderr chunk (state=3): >>><<< 7554 1726853194.38555: stdout chunk (state=3): >>><<< 7554 1726853194.38833: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17h1titc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17h1titc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/bae577e5-e110-4880-a74b-012e4e387e44: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
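For context, the `module_args` captured in the result above correspond to a role invocation along these lines. This is a hypothetical minimal sketch, assuming the play drives the `fedora.linux_system_roles.network` role through its `network_connections` variable; the play name and host pattern are illustrative, not taken from this run's inventory:

```yaml
# Hypothetical reproduction of the traced task; only the connection
# settings (name, persistent_state, state) are taken from the log above.
- name: Tear down veth0 via the network role
  hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: veth0
            persistent_state: absent
            state: down
```

Note that the run still reports `"changed": true` even though the module's stdout carried an `LsrNetworkNmError` traceback for the volatilize step; the error surfaced on stdout rather than failing the task.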
7554 1726853194.38836: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853194.38839: _low_level_execute_command(): starting 7554 1726853194.38841: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853193.9476526-9312-108010681482848/ > /dev/null 2>&1 && sleep 0' 7554 1726853194.39432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853194.39448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853194.39468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853194.39489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853194.39505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853194.39585: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853194.39588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853194.39634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853194.39657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853194.39796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853194.41735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853194.41759: stdout chunk (state=3): >>><<< 7554 1726853194.41769: stderr chunk (state=3): >>><<< 7554 1726853194.41790: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7554 1726853194.41800: handler run complete
7554 1726853194.41832: attempt loop complete, returning result
7554 1726853194.41838: _execute() done
7554 1726853194.41843: dumping result to json
7554 1726853194.41852: done dumping result, returning
7554 1726853194.41872: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-bdc3-98b6-00000000011b]
7554 1726853194.41881: sending task result for task 02083763-bbaf-bdc3-98b6-00000000011b
7554 1726853194.42041: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000011b
7554 1726853194.42044: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "veth0",
                    "persistent_state": "absent",
                    "state": "down"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

7554 1726853194.42147: no more pending results, returning what we have
7554 1726853194.42151: results queue empty
7554 1726853194.42152: checking for any_errors_fatal
7554 1726853194.42159: done checking for any_errors_fatal
7554 1726853194.42160: checking for max_fail_percentage
7554 1726853194.42161: done checking for max_fail_percentage
7554 1726853194.42162: checking to see if all hosts have failed and the running result is not ok
7554 1726853194.42163: done checking to see if all hosts have failed
7554 1726853194.42164: getting the remaining hosts for this loop
7554 1726853194.42165: done getting the remaining hosts for this loop
7554
1726853194.42169: getting the next task for host managed_node3
7554 1726853194.42178: done getting next task for host managed_node3
7554 1726853194.42182: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
7554 1726853194.42185: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853194.42196: getting variables
7554 1726853194.42198: in VariableManager get_vars()
7554 1726853194.42248: Calling all_inventory to load vars for managed_node3
7554 1726853194.42251: Calling groups_inventory to load vars for managed_node3
7554 1726853194.42254: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853194.42264: Calling all_plugins_play to load vars for managed_node3
7554 1726853194.42267: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853194.42270: Calling groups_plugins_play to load vars for managed_node3
7554 1726853194.44152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853194.46032: done with get_vars()
7554 1726853194.46068: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 13:26:34 -0400 (0:00:00.614) 0:00:48.429 ******
7554 1726853194.46155: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state
7554 1726853194.46617: worker is 1 (out of 1 available)
7554 1726853194.46629: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state
7554 1726853194.46639: done queuing things up, now waiting for results queue to drain
7554 1726853194.46641: waiting for pending results...
7554 1726853194.46946: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state
7554 1726853194.47042: in run() - task 02083763-bbaf-bdc3-98b6-00000000011c
7554 1726853194.47046: variable 'ansible_search_path' from source: unknown
7554 1726853194.47077: variable 'ansible_search_path' from source: unknown
7554 1726853194.47105: calling self._execute()
7554 1726853194.47219: variable 'ansible_host' from source: host vars for 'managed_node3'
7554 1726853194.47257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7554 1726853194.47261: variable 'omit' from source: magic vars
7554 1726853194.47630: variable 'ansible_distribution_major_version' from source: facts
7554 1726853194.47653: Evaluated conditional (ansible_distribution_major_version != '6'): True
7554 1726853194.47801: variable 'network_state' from source: role '' defaults
7554 1726853194.47804: Evaluated conditional (network_state != {}): False
7554 1726853194.47807: when evaluation is False, skipping this task
7554 1726853194.47809: _execute() done
7554 1726853194.47811: dumping result to json
7554 1726853194.47812: done dumping result, returning
7554 1726853194.47823: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-bdc3-98b6-00000000011c]
7554 1726853194.47834: sending task result for task 02083763-bbaf-bdc3-98b6-00000000011c
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7554 1726853194.48076: no more pending
results, returning what we have
7554 1726853194.48080: results queue empty
7554 1726853194.48081: checking for any_errors_fatal
7554 1726853194.48091: done checking for any_errors_fatal
7554 1726853194.48092: checking for max_fail_percentage
7554 1726853194.48094: done checking for max_fail_percentage
7554 1726853194.48095: checking to see if all hosts have failed and the running result is not ok
7554 1726853194.48096: done checking to see if all hosts have failed
7554 1726853194.48096: getting the remaining hosts for this loop
7554 1726853194.48098: done getting the remaining hosts for this loop
7554 1726853194.48102: getting the next task for host managed_node3
7554 1726853194.48108: done getting next task for host managed_node3
7554 1726853194.48112: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
7554 1726853194.48115: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7554 1726853194.48140: getting variables
7554 1726853194.48142: in VariableManager get_vars()
7554 1726853194.48194: Calling all_inventory to load vars for managed_node3
7554 1726853194.48197: Calling groups_inventory to load vars for managed_node3
7554 1726853194.48200: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853194.48213: Calling all_plugins_play to load vars for managed_node3
7554 1726853194.48217: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853194.48220: Calling groups_plugins_play to load vars for managed_node3
7554 1726853194.48916: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000011c
7554 1726853194.48924: WORKER PROCESS EXITING
7554 1726853194.49951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853194.51723: done with get_vars()
7554 1726853194.51749: done getting variables
7554 1726853194.51810: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 13:26:34 -0400 (0:00:00.056) 0:00:48.485 ******
7554 1726853194.51844: entering _queue_task() for managed_node3/debug
7554 1726853194.52306: worker is 1 (out of 1 available)
7554 1726853194.52316: exiting _queue_task() for managed_node3/debug
7554 1726853194.52327: done queuing things up, now waiting for results queue to drain
7554 1726853194.52328: waiting for pending results...
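The "Configure networking state" task above was skipped because its `when:` guard evaluated false (`network_state != {}`), while the distribution check (`ansible_distribution_major_version != '6'`) evaluated true. A hedged sketch of what such task definitions look like; the module arguments are assumed for illustration, and the actual definitions in roles/network/tasks/main.yml are not reproduced in this log:

```yaml
# Illustrative only -- the network_state module arguments are an assumption;
# the when: expressions match the conditionals evaluated in the trace above.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}

- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```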
7554 1726853194.52525: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7554 1726853194.52692: in run() - task 02083763-bbaf-bdc3-98b6-00000000011d 7554 1726853194.52719: variable 'ansible_search_path' from source: unknown 7554 1726853194.52728: variable 'ansible_search_path' from source: unknown 7554 1726853194.52766: calling self._execute() 7554 1726853194.52865: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853194.52880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853194.52893: variable 'omit' from source: magic vars 7554 1726853194.53314: variable 'ansible_distribution_major_version' from source: facts 7554 1726853194.53334: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853194.53347: variable 'omit' from source: magic vars 7554 1726853194.53413: variable 'omit' from source: magic vars 7554 1726853194.53460: variable 'omit' from source: magic vars 7554 1726853194.53530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853194.53559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853194.53593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853194.53639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853194.53643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853194.53678: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853194.53686: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853194.53748: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7554 1726853194.53818: Set connection var ansible_shell_executable to /bin/sh 7554 1726853194.53834: Set connection var ansible_pipelining to False 7554 1726853194.53841: Set connection var ansible_shell_type to sh 7554 1726853194.53847: Set connection var ansible_connection to ssh 7554 1726853194.53867: Set connection var ansible_timeout to 10 7554 1726853194.53879: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853194.53907: variable 'ansible_shell_executable' from source: unknown 7554 1726853194.53922: variable 'ansible_connection' from source: unknown 7554 1726853194.53930: variable 'ansible_module_compression' from source: unknown 7554 1726853194.53965: variable 'ansible_shell_type' from source: unknown 7554 1726853194.53968: variable 'ansible_shell_executable' from source: unknown 7554 1726853194.53970: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853194.53974: variable 'ansible_pipelining' from source: unknown 7554 1726853194.53976: variable 'ansible_timeout' from source: unknown 7554 1726853194.53978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853194.54131: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853194.54177: variable 'omit' from source: magic vars 7554 1726853194.54180: starting attempt loop 7554 1726853194.54183: running the handler 7554 1726853194.54307: variable '__network_connections_result' from source: set_fact 7554 1726853194.54369: handler run complete 7554 1726853194.54399: attempt loop complete, returning result 7554 1726853194.54460: _execute() done 7554 1726853194.54463: dumping result to json 7554 1726853194.54465: done dumping result, returning 7554 
1726853194.54468: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-bdc3-98b6-00000000011d]
7554 1726853194.54470: sending task result for task 02083763-bbaf-bdc3-98b6-00000000011d
7554 1726853194.54546: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000011d
7554 1726853194.54550: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
7554 1726853194.54642: no more pending results, returning what we have
7554 1726853194.54645: results queue empty
7554 1726853194.54646: checking for any_errors_fatal
7554 1726853194.54653: done checking for any_errors_fatal
7554 1726853194.54654: checking for max_fail_percentage
7554 1726853194.54656: done checking for max_fail_percentage
7554 1726853194.54657: checking to see if all hosts have failed and the running result is not ok
7554 1726853194.54659: done checking to see if all hosts have failed
7554 1726853194.54659: getting the remaining hosts for this loop
7554 1726853194.54661: done getting the remaining hosts for this loop
7554 1726853194.54665: getting the next task for host managed_node3
7554 1726853194.54674: done getting next task for host managed_node3
7554 1726853194.54678: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
7554 1726853194.54681: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False
7554 1726853194.54695: getting variables
7554 1726853194.54697: in VariableManager get_vars()
7554 1726853194.54747: Calling all_inventory to load vars for managed_node3
7554 1726853194.54750: Calling groups_inventory to load vars for managed_node3
7554 1726853194.54752: Calling all_plugins_inventory to load vars for managed_node3
7554 1726853194.54763: Calling all_plugins_play to load vars for managed_node3
7554 1726853194.54766: Calling groups_plugins_inventory to load vars for managed_node3
7554 1726853194.54769: Calling groups_plugins_play to load vars for managed_node3
7554 1726853194.57777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7554 1726853194.60654: done with get_vars()
7554 1726853194.60691: done getting variables
7554 1726853194.60756: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 13:26:34 -0400 (0:00:00.089) 0:00:48.575 ******
7554 1726853194.60797: entering _queue_task() for managed_node3/debug
7554 1726853194.61272: worker is 1 (out of 1 available)
7554 1726853194.61283: exiting _queue_task() for managed_node3/debug
7554 1726853194.61294: done queuing things up, now waiting for results queue to drain
7554 1726853194.61295: waiting for pending results...
7554 1726853194.61593: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7554 1726853194.61653: in run() - task 02083763-bbaf-bdc3-98b6-00000000011e 7554 1726853194.61776: variable 'ansible_search_path' from source: unknown 7554 1726853194.61781: variable 'ansible_search_path' from source: unknown 7554 1726853194.61784: calling self._execute() 7554 1726853194.61835: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853194.61846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853194.61860: variable 'omit' from source: magic vars 7554 1726853194.62273: variable 'ansible_distribution_major_version' from source: facts 7554 1726853194.62294: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853194.62339: variable 'omit' from source: magic vars 7554 1726853194.62381: variable 'omit' from source: magic vars 7554 1726853194.62426: variable 'omit' from source: magic vars 7554 1726853194.62483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853194.62522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853194.62556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853194.62665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853194.62668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853194.62672: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853194.62675: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853194.62678: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853194.62746: Set connection var ansible_shell_executable to /bin/sh 7554 1726853194.62762: Set connection var ansible_pipelining to False 7554 1726853194.62782: Set connection var ansible_shell_type to sh 7554 1726853194.62790: Set connection var ansible_connection to ssh 7554 1726853194.62806: Set connection var ansible_timeout to 10 7554 1726853194.62817: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853194.62845: variable 'ansible_shell_executable' from source: unknown 7554 1726853194.62855: variable 'ansible_connection' from source: unknown 7554 1726853194.62863: variable 'ansible_module_compression' from source: unknown 7554 1726853194.62873: variable 'ansible_shell_type' from source: unknown 7554 1726853194.62888: variable 'ansible_shell_executable' from source: unknown 7554 1726853194.62894: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853194.62902: variable 'ansible_pipelining' from source: unknown 7554 1726853194.62908: variable 'ansible_timeout' from source: unknown 7554 1726853194.62914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853194.63063: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853194.63100: variable 'omit' from source: magic vars 7554 1726853194.63103: starting attempt loop 7554 1726853194.63177: running the handler 7554 1726853194.63181: variable '__network_connections_result' from source: set_fact 7554 1726853194.63256: variable '__network_connections_result' from source: set_fact 7554 1726853194.63384: handler run complete 7554 1726853194.63412: attempt loop complete, returning result 7554 1726853194.63424: _execute() done 7554 1726853194.63434: dumping 
result to json 7554 1726853194.63442: done dumping result, returning 7554 1726853194.63454: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-bdc3-98b6-00000000011e] 7554 1726853194.63464: sending task result for task 02083763-bbaf-bdc3-98b6-00000000011e 7554 1726853194.63721: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000011e 7554 1726853194.63724: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 7554 1726853194.63819: no more pending results, returning what we have 7554 1726853194.63823: results queue empty 7554 1726853194.63824: checking for any_errors_fatal 7554 1726853194.63831: done checking for any_errors_fatal 7554 1726853194.63831: checking for max_fail_percentage 7554 1726853194.63838: done checking for max_fail_percentage 7554 1726853194.63839: checking to see if all hosts have failed and the running result is not ok 7554 1726853194.63840: done checking to see if all hosts have failed 7554 1726853194.63841: getting the remaining hosts for this loop 7554 1726853194.63842: done getting the remaining hosts for this loop 7554 1726853194.63846: getting the next task for host managed_node3 7554 1726853194.63853: done getting next task for host managed_node3 7554 1726853194.63857: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7554 1726853194.63860: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853194.63874: getting variables 7554 1726853194.63876: in VariableManager get_vars() 7554 1726853194.63924: Calling all_inventory to load vars for managed_node3 7554 1726853194.63927: Calling groups_inventory to load vars for managed_node3 7554 1726853194.63929: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853194.63940: Calling all_plugins_play to load vars for managed_node3 7554 1726853194.64058: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853194.64064: Calling groups_plugins_play to load vars for managed_node3 7554 1726853194.65700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853194.67244: done with get_vars() 7554 1726853194.67270: done getting variables 7554 1726853194.67331: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:26:34 -0400 (0:00:00.065) 0:00:48.641 ****** 7554 1726853194.67368: entering _queue_task() for managed_node3/debug 7554 1726853194.67901: worker is 1 (out of 1 available) 7554 1726853194.67911: exiting _queue_task() 
for managed_node3/debug 7554 1726853194.67922: done queuing things up, now waiting for results queue to drain 7554 1726853194.67923: waiting for pending results... 7554 1726853194.68091: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7554 1726853194.68215: in run() - task 02083763-bbaf-bdc3-98b6-00000000011f 7554 1726853194.68240: variable 'ansible_search_path' from source: unknown 7554 1726853194.68259: variable 'ansible_search_path' from source: unknown 7554 1726853194.68368: calling self._execute() 7554 1726853194.68403: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853194.68416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853194.68433: variable 'omit' from source: magic vars 7554 1726853194.68819: variable 'ansible_distribution_major_version' from source: facts 7554 1726853194.68837: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853194.68963: variable 'network_state' from source: role '' defaults 7554 1726853194.68981: Evaluated conditional (network_state != {}): False 7554 1726853194.68989: when evaluation is False, skipping this task 7554 1726853194.68997: _execute() done 7554 1726853194.69003: dumping result to json 7554 1726853194.69010: done dumping result, returning 7554 1726853194.69031: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-bdc3-98b6-00000000011f] 7554 1726853194.69043: sending task result for task 02083763-bbaf-bdc3-98b6-00000000011f skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7554 1726853194.69307: no more pending results, returning what we have 7554 1726853194.69312: results queue empty 7554 1726853194.69313: checking for any_errors_fatal 7554 1726853194.69323: done checking for any_errors_fatal 7554 1726853194.69324: checking for 
max_fail_percentage 7554 1726853194.69326: done checking for max_fail_percentage 7554 1726853194.69327: checking to see if all hosts have failed and the running result is not ok 7554 1726853194.69328: done checking to see if all hosts have failed 7554 1726853194.69329: getting the remaining hosts for this loop 7554 1726853194.69330: done getting the remaining hosts for this loop 7554 1726853194.69335: getting the next task for host managed_node3 7554 1726853194.69341: done getting next task for host managed_node3 7554 1726853194.69345: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7554 1726853194.69349: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853194.69376: getting variables 7554 1726853194.69378: in VariableManager get_vars() 7554 1726853194.69429: Calling all_inventory to load vars for managed_node3 7554 1726853194.69432: Calling groups_inventory to load vars for managed_node3 7554 1726853194.69435: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853194.69448: Calling all_plugins_play to load vars for managed_node3 7554 1726853194.69452: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853194.69455: Calling groups_plugins_play to load vars for managed_node3 7554 1726853194.69985: done sending task result for task 02083763-bbaf-bdc3-98b6-00000000011f 7554 1726853194.69988: WORKER PROCESS EXITING 7554 1726853194.71044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853194.71962: done with get_vars() 7554 1726853194.71984: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:26:34 -0400 (0:00:00.046) 0:00:48.688 ****** 7554 1726853194.72052: entering _queue_task() for managed_node3/ping 7554 1726853194.72306: worker is 1 (out of 1 available) 7554 1726853194.72319: exiting _queue_task() for managed_node3/ping 7554 1726853194.72332: done queuing things up, now waiting for results queue to drain 7554 1726853194.72333: waiting for pending results... 
7554 1726853194.72524: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7554 1726853194.72626: in run() - task 02083763-bbaf-bdc3-98b6-000000000120 7554 1726853194.72638: variable 'ansible_search_path' from source: unknown 7554 1726853194.72641: variable 'ansible_search_path' from source: unknown 7554 1726853194.72675: calling self._execute() 7554 1726853194.72757: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853194.72760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853194.72770: variable 'omit' from source: magic vars 7554 1726853194.73058: variable 'ansible_distribution_major_version' from source: facts 7554 1726853194.73079: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853194.73082: variable 'omit' from source: magic vars 7554 1726853194.73276: variable 'omit' from source: magic vars 7554 1726853194.73279: variable 'omit' from source: magic vars 7554 1726853194.73282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853194.73285: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853194.73289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853194.73301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853194.73318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853194.73354: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853194.73365: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853194.73377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 
1726853194.73883: Set connection var ansible_shell_executable to /bin/sh 7554 1726853194.73886: Set connection var ansible_pipelining to False 7554 1726853194.73888: Set connection var ansible_shell_type to sh 7554 1726853194.73890: Set connection var ansible_connection to ssh 7554 1726853194.73893: Set connection var ansible_timeout to 10 7554 1726853194.73895: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853194.73897: variable 'ansible_shell_executable' from source: unknown 7554 1726853194.73898: variable 'ansible_connection' from source: unknown 7554 1726853194.73901: variable 'ansible_module_compression' from source: unknown 7554 1726853194.73904: variable 'ansible_shell_type' from source: unknown 7554 1726853194.73907: variable 'ansible_shell_executable' from source: unknown 7554 1726853194.73909: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853194.73912: variable 'ansible_pipelining' from source: unknown 7554 1726853194.73914: variable 'ansible_timeout' from source: unknown 7554 1726853194.73916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853194.74377: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7554 1726853194.74381: variable 'omit' from source: magic vars 7554 1726853194.74384: starting attempt loop 7554 1726853194.74386: running the handler 7554 1726853194.74388: _low_level_execute_command(): starting 7554 1726853194.74390: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853194.75106: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853194.75123: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853194.75135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853194.75178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853194.75199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853194.75261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853194.77315: stdout chunk (state=3): >>>/root <<< 7554 1726853194.77319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853194.77321: stdout chunk (state=3): >>><<< 7554 1726853194.77323: stderr chunk (state=3): >>><<< 7554 1726853194.77326: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853194.77328: _low_level_execute_command(): starting 7554 1726853194.77330: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343 `" && echo ansible-tmp-1726853194.7722638-9346-189776781579343="` echo /root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343 `" ) && sleep 0' 7554 1726853194.78242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853194.78248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853194.78251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853194.78260: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853194.78262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853194.78423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853194.78491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853194.78575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853194.80593: stdout chunk (state=3): >>>ansible-tmp-1726853194.7722638-9346-189776781579343=/root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343 <<< 7554 1726853194.80734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853194.80805: stdout chunk (state=3): >>><<< 7554 1726853194.80808: stderr chunk (state=3): >>><<< 7554 1726853194.80826: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853194.7722638-9346-189776781579343=/root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853194.80886: variable 'ansible_module_compression' from source: unknown 7554 1726853194.81010: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7554 1726853194.81249: variable 'ansible_facts' from source: unknown 7554 1726853194.81252: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343/AnsiballZ_ping.py 7554 1726853194.81593: Sending initial data 7554 1726853194.81603: Sent initial data (151 bytes) 7554 1726853194.82155: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853194.82177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853194.82344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853194.82547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853194.82694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853194.84326: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853194.84376: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853194.84445: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp_ug_5j53 /root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343/AnsiballZ_ping.py <<< 7554 1726853194.84448: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343/AnsiballZ_ping.py" <<< 7554 1726853194.84495: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp_ug_5j53" to remote "/root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343/AnsiballZ_ping.py" <<< 7554 1726853194.85594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853194.85598: stdout chunk (state=3): >>><<< 7554 1726853194.85600: stderr chunk (state=3): >>><<< 7554 1726853194.85602: done transferring module to remote 7554 1726853194.85604: _low_level_execute_command(): starting 7554 1726853194.85607: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343/ /root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343/AnsiballZ_ping.py && sleep 0' 7554 1726853194.86127: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853194.86145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853194.86164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853194.86185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853194.86288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853194.86389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853194.86548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853194.86583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853194.88490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853194.88512: stderr chunk (state=3): >>><<< 7554 1726853194.88521: stdout chunk (state=3): >>><<< 7554 1726853194.88552: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853194.88560: _low_level_execute_command(): starting 7554 1726853194.88568: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343/AnsiballZ_ping.py && sleep 0' 7554 1726853194.89176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853194.89192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853194.89208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853194.89233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853194.89291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853194.89358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 
1726853194.89378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853194.89402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853194.89500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853195.04786: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7554 1726853195.06610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 7554 1726853195.06614: stdout chunk (state=3): >>><<< 7554 1726853195.06616: stderr chunk (state=3): >>><<< 7554 1726853195.06619: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.11.217 closed. 7554 1726853195.06623: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853195.06625: _low_level_execute_command(): starting 7554 1726853195.06627: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853194.7722638-9346-189776781579343/ > /dev/null 2>&1 && sleep 0' 7554 1726853195.07855: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853195.07872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.08052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853195.08104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853195.10046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853195.10057: stdout chunk (state=3): >>><<< 7554 1726853195.10069: stderr chunk (state=3): >>><<< 7554 1726853195.10094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853195.10112: handler run complete 7554 1726853195.10130: attempt loop complete, returning result 7554 1726853195.10137: _execute() done 7554 1726853195.10146: 
dumping result to json 7554 1726853195.10157: done dumping result, returning 7554 1726853195.10170: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-bdc3-98b6-000000000120] 7554 1726853195.10276: sending task result for task 02083763-bbaf-bdc3-98b6-000000000120 7554 1726853195.10353: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000120 7554 1726853195.10357: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7554 1726853195.10424: no more pending results, returning what we have 7554 1726853195.10427: results queue empty 7554 1726853195.10428: checking for any_errors_fatal 7554 1726853195.10436: done checking for any_errors_fatal 7554 1726853195.10437: checking for max_fail_percentage 7554 1726853195.10439: done checking for max_fail_percentage 7554 1726853195.10439: checking to see if all hosts have failed and the running result is not ok 7554 1726853195.10441: done checking to see if all hosts have failed 7554 1726853195.10441: getting the remaining hosts for this loop 7554 1726853195.10443: done getting the remaining hosts for this loop 7554 1726853195.10446: getting the next task for host managed_node3 7554 1726853195.10454: done getting next task for host managed_node3 7554 1726853195.10457: ^ task is: TASK: meta (role_complete) 7554 1726853195.10459: ^ state is: HOST STATE: block=2, task=38, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853195.10673: getting variables 7554 1726853195.10675: in VariableManager get_vars() 7554 1726853195.10719: Calling all_inventory to load vars for managed_node3 7554 1726853195.10722: Calling groups_inventory to load vars for managed_node3 7554 1726853195.10724: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853195.10741: Calling all_plugins_play to load vars for managed_node3 7554 1726853195.10748: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853195.10752: Calling groups_plugins_play to load vars for managed_node3 7554 1726853195.20900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853195.22792: done with get_vars() 7554 1726853195.22884: done getting variables 7554 1726853195.22961: done queuing things up, now waiting for results queue to drain 7554 1726853195.22963: results queue empty 7554 1726853195.22964: checking for any_errors_fatal 7554 1726853195.22968: done checking for any_errors_fatal 7554 1726853195.22968: checking for max_fail_percentage 7554 1726853195.22970: done checking for max_fail_percentage 7554 1726853195.22972: checking to see if all hosts have failed and the running result is not ok 7554 1726853195.22974: done checking to see if all hosts have failed 7554 1726853195.22975: getting the remaining hosts for this loop 7554 1726853195.22976: done getting the remaining hosts for this loop 7554 1726853195.22979: getting the next task for host managed_node3 7554 1726853195.22983: done getting next task for host managed_node3 7554 1726853195.22985: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7554 1726853195.22987: ^ state is: HOST STATE: block=2, task=39, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7554 1726853195.22989: getting variables 7554 1726853195.22990: in VariableManager get_vars() 7554 1726853195.23009: Calling all_inventory to load vars for managed_node3 7554 1726853195.23012: Calling groups_inventory to load vars for managed_node3 7554 1726853195.23014: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853195.23019: Calling all_plugins_play to load vars for managed_node3 7554 1726853195.23022: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853195.23025: Calling groups_plugins_play to load vars for managed_node3 7554 1726853195.24228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853195.26005: done with get_vars() 7554 1726853195.26039: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:145 Friday 20 September 2024 13:26:35 -0400 (0:00:00.540) 0:00:49.228 ****** 7554 1726853195.26111: entering _queue_task() for managed_node3/include_tasks 7554 1726853195.26540: worker is 1 (out of 1 available) 7554 1726853195.26553: exiting _queue_task() for managed_node3/include_tasks 7554 1726853195.26705: done queuing things up, now waiting for results queue to drain 7554 1726853195.26710: waiting for pending results... 
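The `ok: [managed_node3] => {"changed": false, "ping": "pong"}` result a few lines up is the role's "Re-test connectivity" step: Ansible ships the `ping` module to the target as an AnsiballZ payload, runs it over the multiplexed SSH connection (the `auto-mux: Trying existing master at '/root/.ansible/cp/...'` lines show the reused ControlMaster socket), and parses the JSON printed on stdout. A minimal hedged sketch of an equivalent task follows; the task name matches the log banner, but the snippet is illustrative and not copied from the role source:

```yaml
# Hedged sketch: a bare ping task equivalent to the "Re-test connectivity"
# step logged above. A "ping": "pong" result only proves that Ansible can
# push and execute a Python module on the target -- it is not an ICMP ping.
- name: Re-test connectivity
  ansible.builtin.ping:
```

Running such a task with `-vvv` reproduces the `_low_level_execute_command()` / `AnsiballZ_ping.py` lines seen in this log.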
7554 1726853195.26951: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7554 1726853195.27179: in run() - task 02083763-bbaf-bdc3-98b6-000000000150 7554 1726853195.27183: variable 'ansible_search_path' from source: unknown 7554 1726853195.27185: calling self._execute() 7554 1726853195.27233: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853195.27240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853195.27249: variable 'omit' from source: magic vars 7554 1726853195.27662: variable 'ansible_distribution_major_version' from source: facts 7554 1726853195.27675: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853195.27683: _execute() done 7554 1726853195.27687: dumping result to json 7554 1726853195.27690: done dumping result, returning 7554 1726853195.27698: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [02083763-bbaf-bdc3-98b6-000000000150] 7554 1726853195.27703: sending task result for task 02083763-bbaf-bdc3-98b6-000000000150 7554 1726853195.27805: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000150 7554 1726853195.27809: WORKER PROCESS EXITING 7554 1726853195.27840: no more pending results, returning what we have 7554 1726853195.27846: in VariableManager get_vars() 7554 1726853195.28019: Calling all_inventory to load vars for managed_node3 7554 1726853195.28022: Calling groups_inventory to load vars for managed_node3 7554 1726853195.28024: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853195.28036: Calling all_plugins_play to load vars for managed_node3 7554 1726853195.28040: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853195.28043: Calling groups_plugins_play to load vars for managed_node3 7554 1726853195.29689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7554 1726853195.31306: done with get_vars() 7554 1726853195.31337: variable 'ansible_search_path' from source: unknown 7554 1726853195.31354: we have included files to process 7554 1726853195.31356: generating all_blocks data 7554 1726853195.31358: done generating all_blocks data 7554 1726853195.31363: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853195.31365: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853195.31367: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7554 1726853195.31787: in VariableManager get_vars() 7554 1726853195.31818: done with get_vars() 7554 1726853195.32485: done processing included file 7554 1726853195.32488: iterating over new_blocks loaded from include file 7554 1726853195.32489: in VariableManager get_vars() 7554 1726853195.32518: done with get_vars() 7554 1726853195.32521: filtering new block on tags 7554 1726853195.32553: done filtering new block on tags 7554 1726853195.32556: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7554 1726853195.32562: extending task lists for all hosts with included blocks 7554 1726853195.38246: done extending task lists 7554 1726853195.38248: done processing included files 7554 1726853195.38249: results queue empty 7554 1726853195.38254: checking for any_errors_fatal 7554 1726853195.38256: done checking for any_errors_fatal 7554 1726853195.38257: checking for max_fail_percentage 7554 1726853195.38258: done checking for max_fail_percentage 7554 1726853195.38259: checking to see if all hosts have failed and the running 
result is not ok 7554 1726853195.38260: done checking to see if all hosts have failed 7554 1726853195.38261: getting the remaining hosts for this loop 7554 1726853195.38262: done getting the remaining hosts for this loop 7554 1726853195.38265: getting the next task for host managed_node3 7554 1726853195.38270: done getting next task for host managed_node3 7554 1726853195.38273: ^ task is: TASK: Ensure state in ["present", "absent"] 7554 1726853195.38276: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853195.38279: getting variables 7554 1726853195.38280: in VariableManager get_vars() 7554 1726853195.38303: Calling all_inventory to load vars for managed_node3 7554 1726853195.38306: Calling groups_inventory to load vars for managed_node3 7554 1726853195.38308: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853195.38315: Calling all_plugins_play to load vars for managed_node3 7554 1726853195.38317: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853195.38320: Calling groups_plugins_play to load vars for managed_node3 7554 1726853195.39663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853195.41223: done with get_vars() 7554 1726853195.41249: done getting variables 7554 1726853195.41305: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:26:35 -0400 (0:00:00.152) 0:00:49.380 ****** 7554 1726853195.41335: entering _queue_task() for managed_node3/fail 7554 1726853195.41714: worker is 1 (out of 1 available) 7554 1726853195.41728: exiting _queue_task() for managed_node3/fail 7554 1726853195.41742: done queuing things up, now waiting for results queue to drain 7554 1726853195.41744: waiting for pending results... 
7554 1726853195.42304: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7554 1726853195.42579: in run() - task 02083763-bbaf-bdc3-98b6-000000001a6f 7554 1726853195.42584: variable 'ansible_search_path' from source: unknown 7554 1726853195.42588: variable 'ansible_search_path' from source: unknown 7554 1726853195.42596: calling self._execute() 7554 1726853195.42599: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853195.42602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853195.42604: variable 'omit' from source: magic vars 7554 1726853195.42729: variable 'ansible_distribution_major_version' from source: facts 7554 1726853195.42950: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853195.42954: variable 'state' from source: include params 7554 1726853195.42956: Evaluated conditional (state not in ["present", "absent"]): False 7554 1726853195.42958: when evaluation is False, skipping this task 7554 1726853195.42960: _execute() done 7554 1726853195.42962: dumping result to json 7554 1726853195.42964: done dumping result, returning 7554 1726853195.42966: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-bdc3-98b6-000000001a6f] 7554 1726853195.42967: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a6f skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7554 1726853195.43080: no more pending results, returning what we have 7554 1726853195.43086: results queue empty 7554 1726853195.43087: checking for any_errors_fatal 7554 1726853195.43089: done checking for any_errors_fatal 7554 1726853195.43089: checking for max_fail_percentage 7554 1726853195.43091: done checking for max_fail_percentage 7554 1726853195.43092: checking to see if all hosts have failed and the 
running result is not ok 7554 1726853195.43093: done checking to see if all hosts have failed 7554 1726853195.43094: getting the remaining hosts for this loop 7554 1726853195.43095: done getting the remaining hosts for this loop 7554 1726853195.43099: getting the next task for host managed_node3 7554 1726853195.43106: done getting next task for host managed_node3 7554 1726853195.43109: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7554 1726853195.43112: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853195.43117: getting variables 7554 1726853195.43118: in VariableManager get_vars() 7554 1726853195.43175: Calling all_inventory to load vars for managed_node3 7554 1726853195.43178: Calling groups_inventory to load vars for managed_node3 7554 1726853195.43181: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853195.43196: Calling all_plugins_play to load vars for managed_node3 7554 1726853195.43200: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853195.43203: Calling groups_plugins_play to load vars for managed_node3 7554 1726853195.43722: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a6f 7554 1726853195.43726: WORKER PROCESS EXITING 7554 1726853195.44773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853195.46358: done with get_vars() 7554 1726853195.46393: done getting variables 7554 1726853195.46461: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:26:35 -0400 (0:00:00.051) 0:00:49.432 ****** 7554 1726853195.46497: entering _queue_task() for managed_node3/fail 7554 1726853195.46886: worker is 1 (out of 1 available) 7554 1726853195.46900: exiting _queue_task() for managed_node3/fail 7554 1726853195.46914: done queuing things up, now waiting for results queue to drain 7554 1726853195.46916: waiting for pending results... 
7554 1726853195.47320: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7554 1726853195.47349: in run() - task 02083763-bbaf-bdc3-98b6-000000001a70 7554 1726853195.47378: variable 'ansible_search_path' from source: unknown 7554 1726853195.47387: variable 'ansible_search_path' from source: unknown 7554 1726853195.47433: calling self._execute() 7554 1726853195.47568: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853195.47583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853195.47599: variable 'omit' from source: magic vars 7554 1726853195.48033: variable 'ansible_distribution_major_version' from source: facts 7554 1726853195.48055: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853195.48211: variable 'type' from source: play vars 7554 1726853195.48225: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7554 1726853195.48233: when evaluation is False, skipping this task 7554 1726853195.48241: _execute() done 7554 1726853195.48278: dumping result to json 7554 1726853195.48281: done dumping result, returning 7554 1726853195.48283: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-bdc3-98b6-000000001a70] 7554 1726853195.48285: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a70 skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7554 1726853195.48438: no more pending results, returning what we have 7554 1726853195.48445: results queue empty 7554 1726853195.48446: checking for any_errors_fatal 7554 1726853195.48453: done checking for any_errors_fatal 7554 1726853195.48454: checking for max_fail_percentage 7554 1726853195.48456: done checking for max_fail_percentage 7554 1726853195.48457: checking to see if all hosts have failed and the 
running result is not ok 7554 1726853195.48459: done checking to see if all hosts have failed 7554 1726853195.48460: getting the remaining hosts for this loop 7554 1726853195.48461: done getting the remaining hosts for this loop 7554 1726853195.48465: getting the next task for host managed_node3 7554 1726853195.48474: done getting next task for host managed_node3 7554 1726853195.48477: ^ task is: TASK: Include the task 'show_interfaces.yml' 7554 1726853195.48481: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853195.48486: getting variables 7554 1726853195.48488: in VariableManager get_vars() 7554 1726853195.48548: Calling all_inventory to load vars for managed_node3 7554 1726853195.48551: Calling groups_inventory to load vars for managed_node3 7554 1726853195.48554: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853195.48567: Calling all_plugins_play to load vars for managed_node3 7554 1726853195.48875: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853195.48882: Calling groups_plugins_play to load vars for managed_node3 7554 1726853195.49585: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a70 7554 1726853195.49588: WORKER PROCESS EXITING 7554 1726853195.50380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853195.52197: done with get_vars() 7554 1726853195.52231: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:26:35 -0400 (0:00:00.058) 0:00:49.490 ****** 7554 1726853195.52325: entering _queue_task() for managed_node3/include_tasks 7554 1726853195.52694: worker is 1 (out of 1 available) 7554 1726853195.52707: exiting _queue_task() for managed_node3/include_tasks 7554 1726853195.52721: done queuing things up, now waiting for results queue to drain 7554 1726853195.52722: waiting for pending results... 
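The two `skipping: [managed_node3]` results above come from guard tasks that fail fast on bad parameters; the `false_condition` strings in the skip output quote their `when:` expressions verbatim. A hedged reconstruction of the corresponding tasks in `manage_test_interface.yml` (only the conditions are taken from the log; the `msg` texts are invented for illustration):

```yaml
# Hedged reconstruction of the two validation tasks. The when: expressions
# are copied from the false_condition fields above; the messages are invented.
- name: Ensure state in ["present", "absent"]
  ansible.builtin.fail:
    msg: "state must be one of: present, absent"
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  ansible.builtin.fail:
    msg: "type must be one of: dummy, tap, veth"
  when: type not in ["dummy", "tap", "veth"]
```

With valid input both conditions evaluate to False, so the `fail` tasks are skipped — which is exactly the fail-with-when input-validation pattern the skip results above record.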
7554 1726853195.52931: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7554 1726853195.53050: in run() - task 02083763-bbaf-bdc3-98b6-000000001a71 7554 1726853195.53078: variable 'ansible_search_path' from source: unknown 7554 1726853195.53088: variable 'ansible_search_path' from source: unknown 7554 1726853195.53132: calling self._execute() 7554 1726853195.53248: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853195.53261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853195.53278: variable 'omit' from source: magic vars 7554 1726853195.53666: variable 'ansible_distribution_major_version' from source: facts 7554 1726853195.53685: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853195.53695: _execute() done 7554 1726853195.53702: dumping result to json 7554 1726853195.53709: done dumping result, returning 7554 1726853195.53718: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-bdc3-98b6-000000001a71] 7554 1726853195.53727: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a71 7554 1726853195.53832: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a71 7554 1726853195.53840: WORKER PROCESS EXITING 7554 1726853195.53885: no more pending results, returning what we have 7554 1726853195.53890: in VariableManager get_vars() 7554 1726853195.53946: Calling all_inventory to load vars for managed_node3 7554 1726853195.53949: Calling groups_inventory to load vars for managed_node3 7554 1726853195.53951: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853195.54077: Calling all_plugins_play to load vars for managed_node3 7554 1726853195.54081: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853195.54084: Calling groups_plugins_play to load vars for managed_node3 7554 1726853195.55388: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853195.57316: done with get_vars() 7554 1726853195.57339: variable 'ansible_search_path' from source: unknown 7554 1726853195.57340: variable 'ansible_search_path' from source: unknown 7554 1726853195.57389: we have included files to process 7554 1726853195.57391: generating all_blocks data 7554 1726853195.57393: done generating all_blocks data 7554 1726853195.57398: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853195.57400: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853195.57402: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7554 1726853195.57521: in VariableManager get_vars() 7554 1726853195.57554: done with get_vars() 7554 1726853195.57678: done processing included file 7554 1726853195.57681: iterating over new_blocks loaded from include file 7554 1726853195.57682: in VariableManager get_vars() 7554 1726853195.57709: done with get_vars() 7554 1726853195.57712: filtering new block on tags 7554 1726853195.57730: done filtering new block on tags 7554 1726853195.57732: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7554 1726853195.57738: extending task lists for all hosts with included blocks 7554 1726853195.58150: done extending task lists 7554 1726853195.58151: done processing included files 7554 1726853195.58152: results queue empty 7554 1726853195.58153: checking for any_errors_fatal 7554 1726853195.58156: done checking for any_errors_fatal 7554 1726853195.58157: checking for max_fail_percentage 7554 
1726853195.58158: done checking for max_fail_percentage 7554 1726853195.58159: checking to see if all hosts have failed and the running result is not ok 7554 1726853195.58160: done checking to see if all hosts have failed 7554 1726853195.58160: getting the remaining hosts for this loop 7554 1726853195.58161: done getting the remaining hosts for this loop 7554 1726853195.58164: getting the next task for host managed_node3 7554 1726853195.58167: done getting next task for host managed_node3 7554 1726853195.58170: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7554 1726853195.58174: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853195.58177: getting variables 7554 1726853195.58178: in VariableManager get_vars() 7554 1726853195.58194: Calling all_inventory to load vars for managed_node3 7554 1726853195.58197: Calling groups_inventory to load vars for managed_node3 7554 1726853195.58199: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853195.58204: Calling all_plugins_play to load vars for managed_node3 7554 1726853195.58206: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853195.58208: Calling groups_plugins_play to load vars for managed_node3 7554 1726853195.59432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853195.61158: done with get_vars() 7554 1726853195.61183: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:26:35 -0400 (0:00:00.089) 0:00:49.579 ****** 7554 1726853195.61240: entering _queue_task() for managed_node3/include_tasks 7554 1726853195.61515: worker is 1 (out of 1 available) 7554 1726853195.61531: exiting _queue_task() for managed_node3/include_tasks 7554 1726853195.61546: done queuing things up, now waiting for results queue to drain 7554 1726853195.61548: waiting for pending results... 
7554 1726853195.61730: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7554 1726853195.61818: in run() - task 02083763-bbaf-bdc3-98b6-000000001d1c 7554 1726853195.61831: variable 'ansible_search_path' from source: unknown 7554 1726853195.61835: variable 'ansible_search_path' from source: unknown 7554 1726853195.61865: calling self._execute() 7554 1726853195.61940: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853195.61948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853195.61956: variable 'omit' from source: magic vars 7554 1726853195.62247: variable 'ansible_distribution_major_version' from source: facts 7554 1726853195.62256: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853195.62262: _execute() done 7554 1726853195.62264: dumping result to json 7554 1726853195.62268: done dumping result, returning 7554 1726853195.62276: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-bdc3-98b6-000000001d1c] 7554 1726853195.62281: sending task result for task 02083763-bbaf-bdc3-98b6-000000001d1c 7554 1726853195.62367: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001d1c 7554 1726853195.62370: WORKER PROCESS EXITING 7554 1726853195.62404: no more pending results, returning what we have 7554 1726853195.62409: in VariableManager get_vars() 7554 1726853195.62465: Calling all_inventory to load vars for managed_node3 7554 1726853195.62468: Calling groups_inventory to load vars for managed_node3 7554 1726853195.62470: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853195.62485: Calling all_plugins_play to load vars for managed_node3 7554 1726853195.62489: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853195.62491: Calling groups_plugins_play to load vars for managed_node3 7554 1726853195.63691: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853195.65480: done with get_vars() 7554 1726853195.65507: variable 'ansible_search_path' from source: unknown 7554 1726853195.65509: variable 'ansible_search_path' from source: unknown 7554 1726853195.65578: we have included files to process 7554 1726853195.65579: generating all_blocks data 7554 1726853195.65581: done generating all_blocks data 7554 1726853195.65583: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853195.65584: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853195.65587: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7554 1726853195.65884: done processing included file 7554 1726853195.65887: iterating over new_blocks loaded from include file 7554 1726853195.65889: in VariableManager get_vars() 7554 1726853195.65918: done with get_vars() 7554 1726853195.65920: filtering new block on tags 7554 1726853195.65938: done filtering new block on tags 7554 1726853195.65940: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7554 1726853195.65946: extending task lists for all hosts with included blocks 7554 1726853195.66278: done extending task lists 7554 1726853195.66280: done processing included files 7554 1726853195.66282: results queue empty 7554 1726853195.66283: checking for any_errors_fatal 7554 1726853195.66286: done checking for any_errors_fatal 7554 1726853195.66287: checking for max_fail_percentage 7554 1726853195.66288: done checking for max_fail_percentage 7554 
1726853195.66289: checking to see if all hosts have failed and the running result is not ok 7554 1726853195.66290: done checking to see if all hosts have failed 7554 1726853195.66290: getting the remaining hosts for this loop 7554 1726853195.66292: done getting the remaining hosts for this loop 7554 1726853195.66294: getting the next task for host managed_node3 7554 1726853195.66299: done getting next task for host managed_node3 7554 1726853195.66301: ^ task is: TASK: Gather current interface info 7554 1726853195.66305: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853195.66308: getting variables 7554 1726853195.66309: in VariableManager get_vars() 7554 1726853195.66329: Calling all_inventory to load vars for managed_node3 7554 1726853195.66331: Calling groups_inventory to load vars for managed_node3 7554 1726853195.66333: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853195.66339: Calling all_plugins_play to load vars for managed_node3 7554 1726853195.66342: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853195.66344: Calling groups_plugins_play to load vars for managed_node3 7554 1726853195.68041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853195.69687: done with get_vars() 7554 1726853195.69711: done getting variables 7554 1726853195.69748: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:26:35 -0400 (0:00:00.085) 0:00:49.665 ****** 7554 1726853195.69775: entering _queue_task() for managed_node3/command 7554 1726853195.70041: worker is 1 (out of 1 available) 7554 1726853195.70055: exiting _queue_task() for managed_node3/command 7554 1726853195.70068: done queuing things up, now waiting for results queue to drain 7554 1726853195.70070: waiting for pending results... 
7554 1726853195.70262: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7554 1726853195.70353: in run() - task 02083763-bbaf-bdc3-98b6-000000001d53 7554 1726853195.70365: variable 'ansible_search_path' from source: unknown 7554 1726853195.70369: variable 'ansible_search_path' from source: unknown 7554 1726853195.70400: calling self._execute() 7554 1726853195.70504: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853195.70526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853195.70530: variable 'omit' from source: magic vars 7554 1726853195.70930: variable 'ansible_distribution_major_version' from source: facts 7554 1726853195.70935: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853195.70942: variable 'omit' from source: magic vars 7554 1726853195.71000: variable 'omit' from source: magic vars 7554 1726853195.71039: variable 'omit' from source: magic vars 7554 1726853195.71083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853195.71117: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853195.71141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853195.71160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853195.71173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853195.71202: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853195.71205: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853195.71208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853195.71316: Set connection 
var ansible_shell_executable to /bin/sh 7554 1726853195.71324: Set connection var ansible_pipelining to False 7554 1726853195.71327: Set connection var ansible_shell_type to sh 7554 1726853195.71329: Set connection var ansible_connection to ssh 7554 1726853195.71339: Set connection var ansible_timeout to 10 7554 1726853195.71356: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853195.71375: variable 'ansible_shell_executable' from source: unknown 7554 1726853195.71378: variable 'ansible_connection' from source: unknown 7554 1726853195.71381: variable 'ansible_module_compression' from source: unknown 7554 1726853195.71384: variable 'ansible_shell_type' from source: unknown 7554 1726853195.71387: variable 'ansible_shell_executable' from source: unknown 7554 1726853195.71389: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853195.71393: variable 'ansible_pipelining' from source: unknown 7554 1726853195.71395: variable 'ansible_timeout' from source: unknown 7554 1726853195.71399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853195.71577: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853195.71750: variable 'omit' from source: magic vars 7554 1726853195.71753: starting attempt loop 7554 1726853195.71756: running the handler 7554 1726853195.71758: _low_level_execute_command(): starting 7554 1726853195.71760: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853195.72361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853195.72385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853195.72397: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853195.72411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853195.72443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853195.72446: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853195.72456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.72468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853195.72493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853195.72514: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7554 1726853195.72518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853195.72520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853195.72560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853195.72577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.72629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853195.72638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853195.72642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853195.72701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853195.74680: stdout chunk (state=3): >>>/root <<< 7554 1726853195.74683: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 7554 1726853195.74686: stdout chunk (state=3): >>><<< 7554 1726853195.74689: stderr chunk (state=3): >>><<< 7554 1726853195.74692: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853195.74694: _low_level_execute_command(): starting 7554 1726853195.74697: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738 `" && echo ansible-tmp-1726853195.745861-9390-67696184921738="` echo /root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738 `" ) && sleep 0' 7554 1726853195.75165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853195.75175: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7554 1726853195.75185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853195.75200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853195.75212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853195.75221: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853195.75228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.75243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853195.75297: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853195.75301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853195.75304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.75340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853195.75408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853195.77361: stdout chunk (state=3): >>>ansible-tmp-1726853195.745861-9390-67696184921738=/root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738 <<< 7554 1726853195.77462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853195.77493: stderr chunk (state=3): >>><<< 7554 1726853195.77497: 
stdout chunk (state=3): >>><<< 7554 1726853195.77517: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853195.745861-9390-67696184921738=/root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853195.77543: variable 'ansible_module_compression' from source: unknown 7554 1726853195.77590: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853195.77623: variable 'ansible_facts' from source: unknown 7554 1726853195.77675: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738/AnsiballZ_command.py 7554 1726853195.77781: Sending initial data 7554 1726853195.77785: Sent initial data (152 bytes) 7554 1726853195.78229: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853195.78270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853195.78275: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.78278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853195.78280: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853195.78282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853195.78284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.78328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853195.78331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853195.78333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853195.78401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853195.80013: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7554 1726853195.80017: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853195.80069: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853195.80129: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpvbewcr2a /root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738/AnsiballZ_command.py <<< 7554 1726853195.80132: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738/AnsiballZ_command.py" <<< 7554 1726853195.80184: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpvbewcr2a" to remote "/root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738/AnsiballZ_command.py" <<< 7554 1726853195.80189: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738/AnsiballZ_command.py" <<< 7554 1726853195.80783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853195.80827: stderr chunk (state=3): >>><<< 7554 1726853195.80830: stdout chunk (state=3): >>><<< 7554 1726853195.80876: done transferring module to remote 7554 1726853195.80885: _low_level_execute_command(): starting 7554 1726853195.80890: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738/ 
/root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738/AnsiballZ_command.py && sleep 0' 7554 1726853195.81346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853195.81349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.81352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853195.81354: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853195.81356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.81415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853195.81423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853195.81425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853195.81474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853195.83324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853195.83355: stderr chunk (state=3): >>><<< 7554 1726853195.83358: stdout chunk (state=3): >>><<< 7554 1726853195.83370: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853195.83374: _low_level_execute_command(): starting 7554 1726853195.83379: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738/AnsiballZ_command.py && sleep 0' 7554 1726853195.83817: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853195.83821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.83824: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 7554 1726853195.83826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853195.83828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853195.83876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853195.83880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853195.83883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853195.83951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853195.99699: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:26:35.991990", "end": "2024-09-20 13:26:35.995517", "delta": "0:00:00.003527", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853196.01485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853196.01489: stdout chunk (state=3): >>><<< 7554 1726853196.01491: stderr chunk (state=3): >>><<< 7554 1726853196.01495: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:26:35.991990", "end": "2024-09-20 13:26:35.995517", "delta": "0:00:00.003527", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
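The module result above shows what the "Gather current interface info" task actually does: it runs `ls -1` with `chdir: /sys/class/net`, so the reported interface list (`eth0`, `lo`, `peerveth0`, `veth0`) is simply the directory listing of the kernel's network-class sysfs tree. A minimal Python sketch of the same idea (the helper name is hypothetical, not part of the role):

```python
import os

def list_net_interfaces(sysfs_net="/sys/class/net"):
    """Mimic the task's `ls -1` in /sys/class/net: each kernel network
    interface appears as one entry in this sysfs directory."""
    return sorted(os.listdir(sysfs_net))
```

On the managed node traced in this log, such a call would return the same four names the module reported in its `stdout`.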
7554 1726853196.01577: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853196.01695: _low_level_execute_command(): starting 7554 1726853196.01698: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853195.745861-9390-67696184921738/ > /dev/null 2>&1 && sleep 0' 7554 1726853196.02359: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853196.02532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853196.02535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853196.02607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853196.02666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853196.04533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853196.04602: stderr chunk (state=3): >>><<< 7554 1726853196.04605: stdout chunk (state=3): >>><<< 7554 1726853196.04626: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853196.04739: handler run complete 7554 1726853196.04745: Evaluated 
conditional (False): False 7554 1726853196.04748: attempt loop complete, returning result 7554 1726853196.04749: _execute() done 7554 1726853196.04751: dumping result to json 7554 1726853196.04753: done dumping result, returning 7554 1726853196.04754: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-bdc3-98b6-000000001d53] 7554 1726853196.04756: sending task result for task 02083763-bbaf-bdc3-98b6-000000001d53 7554 1726853196.04823: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001d53 7554 1726853196.04825: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003527", "end": "2024-09-20 13:26:35.995517", "rc": 0, "start": "2024-09-20 13:26:35.991990" } STDOUT: eth0 lo peerveth0 veth0 7554 1726853196.04898: no more pending results, returning what we have 7554 1726853196.04901: results queue empty 7554 1726853196.04902: checking for any_errors_fatal 7554 1726853196.04904: done checking for any_errors_fatal 7554 1726853196.04904: checking for max_fail_percentage 7554 1726853196.04906: done checking for max_fail_percentage 7554 1726853196.04907: checking to see if all hosts have failed and the running result is not ok 7554 1726853196.04908: done checking to see if all hosts have failed 7554 1726853196.04908: getting the remaining hosts for this loop 7554 1726853196.04910: done getting the remaining hosts for this loop 7554 1726853196.04913: getting the next task for host managed_node3 7554 1726853196.04920: done getting next task for host managed_node3 7554 1726853196.04922: ^ task is: TASK: Set current_interfaces 7554 1726853196.04928: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853196.04932: getting variables 7554 1726853196.04934: in VariableManager get_vars() 7554 1726853196.05132: Calling all_inventory to load vars for managed_node3 7554 1726853196.05135: Calling groups_inventory to load vars for managed_node3 7554 1726853196.05138: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853196.05148: Calling all_plugins_play to load vars for managed_node3 7554 1726853196.05150: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853196.05153: Calling groups_plugins_play to load vars for managed_node3 7554 1726853196.06299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853196.07173: done with get_vars() 7554 1726853196.07191: done getting variables 7554 1726853196.07234: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:26:36 -0400 (0:00:00.374) 0:00:50.040 ****** 7554 1726853196.07261: entering _queue_task() for managed_node3/set_fact 7554 1726853196.07526: worker is 1 (out of 1 available) 7554 1726853196.07540: exiting _queue_task() for managed_node3/set_fact 7554 1726853196.07556: done queuing things up, now waiting for results queue to drain 7554 1726853196.07557: waiting for pending results... 7554 1726853196.07753: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7554 1726853196.07885: in run() - task 02083763-bbaf-bdc3-98b6-000000001d54 7554 1726853196.07890: variable 'ansible_search_path' from source: unknown 7554 1726853196.07893: variable 'ansible_search_path' from source: unknown 7554 1726853196.07941: calling self._execute() 7554 1726853196.08198: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.08202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.08205: variable 'omit' from source: magic vars 7554 1726853196.08391: variable 'ansible_distribution_major_version' from source: facts 7554 1726853196.08402: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853196.08408: variable 'omit' from source: magic vars 7554 1726853196.08456: variable 'omit' from source: magic vars 7554 1726853196.08558: variable '_current_interfaces' from source: set_fact 7554 1726853196.08634: variable 'omit' from source: magic vars 7554 1726853196.08726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853196.08730: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853196.08768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853196.08794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853196.08810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853196.08852: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853196.08859: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.08878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.08992: Set connection var ansible_shell_executable to /bin/sh 7554 1726853196.09052: Set connection var ansible_pipelining to False 7554 1726853196.09055: Set connection var ansible_shell_type to sh 7554 1726853196.09057: Set connection var ansible_connection to ssh 7554 1726853196.09059: Set connection var ansible_timeout to 10 7554 1726853196.09061: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853196.09063: variable 'ansible_shell_executable' from source: unknown 7554 1726853196.09070: variable 'ansible_connection' from source: unknown 7554 1726853196.09087: variable 'ansible_module_compression' from source: unknown 7554 1726853196.09095: variable 'ansible_shell_type' from source: unknown 7554 1726853196.09102: variable 'ansible_shell_executable' from source: unknown 7554 1726853196.09108: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.09116: variable 'ansible_pipelining' from source: unknown 7554 1726853196.09122: variable 'ansible_timeout' from source: unknown 7554 1726853196.09138: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7554 1726853196.09346: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853196.09354: variable 'omit' from source: magic vars 7554 1726853196.09359: starting attempt loop 7554 1726853196.09362: running the handler 7554 1726853196.09376: handler run complete 7554 1726853196.09383: attempt loop complete, returning result 7554 1726853196.09386: _execute() done 7554 1726853196.09388: dumping result to json 7554 1726853196.09391: done dumping result, returning 7554 1726853196.09398: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [02083763-bbaf-bdc3-98b6-000000001d54] 7554 1726853196.09407: sending task result for task 02083763-bbaf-bdc3-98b6-000000001d54 7554 1726853196.09505: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001d54 7554 1726853196.09508: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo", "peerveth0", "veth0" ] }, "changed": false } 7554 1726853196.09567: no more pending results, returning what we have 7554 1726853196.09573: results queue empty 7554 1726853196.09574: checking for any_errors_fatal 7554 1726853196.09584: done checking for any_errors_fatal 7554 1726853196.09585: checking for max_fail_percentage 7554 1726853196.09586: done checking for max_fail_percentage 7554 1726853196.09588: checking to see if all hosts have failed and the running result is not ok 7554 1726853196.09589: done checking to see if all hosts have failed 7554 1726853196.09590: getting the remaining hosts for this loop 7554 1726853196.09591: done getting the remaining hosts for this loop 7554 1726853196.09595: getting the next task for host managed_node3 7554 1726853196.09601: done getting next task for 
host managed_node3 7554 1726853196.09604: ^ task is: TASK: Show current_interfaces 7554 1726853196.09608: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853196.09612: getting variables 7554 1726853196.09613: in VariableManager get_vars() 7554 1726853196.09661: Calling all_inventory to load vars for managed_node3 7554 1726853196.09663: Calling groups_inventory to load vars for managed_node3 7554 1726853196.09666: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853196.09683: Calling all_plugins_play to load vars for managed_node3 7554 1726853196.09686: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853196.09689: Calling groups_plugins_play to load vars for managed_node3 7554 1726853196.10482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853196.11873: done with get_vars() 7554 1726853196.11894: done getting variables 7554 1726853196.11952: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:26:36 -0400 (0:00:00.047) 0:00:50.087 ****** 7554 1726853196.11987: entering _queue_task() for managed_node3/debug 7554 1726853196.12314: worker is 1 (out of 1 available) 7554 1726853196.12327: exiting _queue_task() for managed_node3/debug 7554 1726853196.12340: done queuing things up, now waiting for results queue to drain 7554 1726853196.12341: waiting for pending results... 7554 1726853196.12703: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7554 1726853196.12766: in run() - task 02083763-bbaf-bdc3-98b6-000000001d1d 7554 1726853196.12782: variable 'ansible_search_path' from source: unknown 7554 1726853196.12786: variable 'ansible_search_path' from source: unknown 7554 1726853196.12814: calling self._execute() 7554 1726853196.12897: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.12903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.12911: variable 'omit' from source: magic vars 7554 1726853196.13202: variable 'ansible_distribution_major_version' from source: facts 7554 1726853196.13213: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853196.13218: variable 'omit' from source: magic vars 7554 1726853196.13249: variable 'omit' from source: magic vars 7554 1726853196.13318: variable 'current_interfaces' from source: set_fact 7554 1726853196.13340: variable 'omit' from source: magic vars 7554 1726853196.13375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853196.13401: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853196.13420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853196.13433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853196.13443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853196.13473: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853196.13476: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.13479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.13550: Set connection var ansible_shell_executable to /bin/sh 7554 1726853196.13557: Set connection var ansible_pipelining to False 7554 1726853196.13560: Set connection var ansible_shell_type to sh 7554 1726853196.13562: Set connection var ansible_connection to ssh 7554 1726853196.13570: Set connection var ansible_timeout to 10 7554 1726853196.13576: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853196.13595: variable 'ansible_shell_executable' from source: unknown 7554 1726853196.13599: variable 'ansible_connection' from source: unknown 7554 1726853196.13601: variable 'ansible_module_compression' from source: unknown 7554 1726853196.13604: variable 'ansible_shell_type' from source: unknown 7554 1726853196.13606: variable 'ansible_shell_executable' from source: unknown 7554 1726853196.13608: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.13610: variable 'ansible_pipelining' from source: unknown 7554 1726853196.13614: variable 'ansible_timeout' from source: unknown 7554 1726853196.13616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 
1726853196.13717: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853196.13727: variable 'omit' from source: magic vars 7554 1726853196.13730: starting attempt loop 7554 1726853196.13733: running the handler 7554 1726853196.13774: handler run complete 7554 1726853196.13783: attempt loop complete, returning result 7554 1726853196.13786: _execute() done 7554 1726853196.13789: dumping result to json 7554 1726853196.13791: done dumping result, returning 7554 1726853196.13798: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [02083763-bbaf-bdc3-98b6-000000001d1d] 7554 1726853196.13802: sending task result for task 02083763-bbaf-bdc3-98b6-000000001d1d 7554 1726853196.13883: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001d1d 7554 1726853196.13886: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo', 'peerveth0', 'veth0'] 7554 1726853196.13930: no more pending results, returning what we have 7554 1726853196.13933: results queue empty 7554 1726853196.13933: checking for any_errors_fatal 7554 1726853196.13939: done checking for any_errors_fatal 7554 1726853196.13940: checking for max_fail_percentage 7554 1726853196.13941: done checking for max_fail_percentage 7554 1726853196.13942: checking to see if all hosts have failed and the running result is not ok 7554 1726853196.13943: done checking to see if all hosts have failed 7554 1726853196.13944: getting the remaining hosts for this loop 7554 1726853196.13945: done getting the remaining hosts for this loop 7554 1726853196.13949: getting the next task for host managed_node3 7554 1726853196.13956: done getting next task for host managed_node3 7554 1726853196.13959: ^ task is: TASK: Install 
iproute 7554 1726853196.13962: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853196.13967: getting variables 7554 1726853196.13969: in VariableManager get_vars() 7554 1726853196.14016: Calling all_inventory to load vars for managed_node3 7554 1726853196.14019: Calling groups_inventory to load vars for managed_node3 7554 1726853196.14021: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853196.14031: Calling all_plugins_play to load vars for managed_node3 7554 1726853196.14033: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853196.14035: Calling groups_plugins_play to load vars for managed_node3 7554 1726853196.15139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853196.16154: done with get_vars() 7554 1726853196.16173: done getting variables 7554 1726853196.16216: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:26:36 -0400 
(0:00:00.042) 0:00:50.129 ****** 7554 1726853196.16240: entering _queue_task() for managed_node3/package 7554 1726853196.16477: worker is 1 (out of 1 available) 7554 1726853196.16490: exiting _queue_task() for managed_node3/package 7554 1726853196.16502: done queuing things up, now waiting for results queue to drain 7554 1726853196.16504: waiting for pending results... 7554 1726853196.16693: running TaskExecutor() for managed_node3/TASK: Install iproute 7554 1726853196.16768: in run() - task 02083763-bbaf-bdc3-98b6-000000001a72 7554 1726853196.16780: variable 'ansible_search_path' from source: unknown 7554 1726853196.16784: variable 'ansible_search_path' from source: unknown 7554 1726853196.16811: calling self._execute() 7554 1726853196.16892: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.16897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.16906: variable 'omit' from source: magic vars 7554 1726853196.17188: variable 'ansible_distribution_major_version' from source: facts 7554 1726853196.17198: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853196.17204: variable 'omit' from source: magic vars 7554 1726853196.17230: variable 'omit' from source: magic vars 7554 1726853196.17366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7554 1726853196.18822: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7554 1726853196.18868: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7554 1726853196.18899: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7554 1726853196.18925: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7554 1726853196.18945: Loading FilterModule 
'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7554 1726853196.19020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7554 1726853196.19046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7554 1726853196.19065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7554 1726853196.19093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7554 1726853196.19103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7554 1726853196.19180: variable '__network_is_ostree' from source: set_fact 7554 1726853196.19184: variable 'omit' from source: magic vars 7554 1726853196.19207: variable 'omit' from source: magic vars 7554 1726853196.19234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853196.19254: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853196.19270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853196.19284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853196.19294: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853196.19318: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853196.19320: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.19323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.19396: Set connection var ansible_shell_executable to /bin/sh 7554 1726853196.19403: Set connection var ansible_pipelining to False 7554 1726853196.19406: Set connection var ansible_shell_type to sh 7554 1726853196.19409: Set connection var ansible_connection to ssh 7554 1726853196.19416: Set connection var ansible_timeout to 10 7554 1726853196.19421: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853196.19438: variable 'ansible_shell_executable' from source: unknown 7554 1726853196.19441: variable 'ansible_connection' from source: unknown 7554 1726853196.19444: variable 'ansible_module_compression' from source: unknown 7554 1726853196.19448: variable 'ansible_shell_type' from source: unknown 7554 1726853196.19450: variable 'ansible_shell_executable' from source: unknown 7554 1726853196.19460: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.19463: variable 'ansible_pipelining' from source: unknown 7554 1726853196.19465: variable 'ansible_timeout' from source: unknown 7554 1726853196.19467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.19533: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853196.19542: variable 'omit' from source: magic vars 7554 1726853196.19550: starting attempt loop 7554 
1726853196.19553: running the handler 7554 1726853196.19564: variable 'ansible_facts' from source: unknown 7554 1726853196.19568: variable 'ansible_facts' from source: unknown 7554 1726853196.19595: _low_level_execute_command(): starting 7554 1726853196.19601: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853196.20107: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.20111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.20115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.20117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.20168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853196.20175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853196.20249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853196.21980: stdout chunk (state=3): >>>/root <<< 7554 1726853196.22085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
7554 1726853196.22123: stderr chunk (state=3): >>><<< 7554 1726853196.22125: stdout chunk (state=3): >>><<< 7554 1726853196.22139: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853196.22183: _low_level_execute_command(): starting 7554 1726853196.22187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847 `" && echo ansible-tmp-1726853196.2215238-9410-42895826395847="` echo /root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847 `" ) && sleep 0' 7554 1726853196.22624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 7554 1726853196.22627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853196.22629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853196.22632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.22634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.22677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853196.22681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853196.22694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853196.22751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853196.24700: stdout chunk (state=3): >>>ansible-tmp-1726853196.2215238-9410-42895826395847=/root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847 <<< 7554 1726853196.24804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853196.24832: stderr chunk (state=3): >>><<< 7554 1726853196.24835: stdout chunk (state=3): >>><<< 7554 1726853196.24851: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853196.2215238-9410-42895826395847=/root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853196.24884: variable 'ansible_module_compression' from source: unknown 7554 1726853196.24927: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7554 1726853196.24962: variable 'ansible_facts' from source: unknown 7554 1726853196.25048: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847/AnsiballZ_dnf.py 7554 1726853196.25150: Sending initial data 7554 1726853196.25154: Sent initial data (149 bytes) 7554 1726853196.25604: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.25607: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853196.25609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.25611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.25614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.25665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853196.25668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853196.25734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853196.27310: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" <<< 7554 1726853196.27318: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853196.27372: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853196.27431: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpybs8myve /root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847/AnsiballZ_dnf.py <<< 7554 1726853196.27433: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847/AnsiballZ_dnf.py" <<< 7554 1726853196.27487: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpybs8myve" to remote "/root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847/AnsiballZ_dnf.py" <<< 7554 1726853196.27493: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847/AnsiballZ_dnf.py" <<< 7554 1726853196.28208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853196.28252: stderr chunk (state=3): >>><<< 7554 1726853196.28255: stdout chunk (state=3): >>><<< 7554 1726853196.28298: done transferring module to remote 7554 1726853196.28306: _low_level_execute_command(): starting 7554 1726853196.28313: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847/ /root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847/AnsiballZ_dnf.py && sleep 0' 7554 1726853196.28728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853196.28737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 7554 1726853196.28765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.28768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853196.28778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853196.28783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.28828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853196.28834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853196.28836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853196.28897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853196.30718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853196.30740: stderr chunk (state=3): >>><<< 7554 1726853196.30743: stdout chunk (state=3): >>><<< 7554 1726853196.30759: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853196.30762: _low_level_execute_command(): starting 7554 1726853196.30766: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847/AnsiballZ_dnf.py && sleep 0' 7554 1726853196.31176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.31206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853196.31209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853196.31211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.31214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
7554 1726853196.31215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.31265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853196.31268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853196.31337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853196.73907: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7554 1726853196.78294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853196.78319: stderr chunk (state=3): >>><<< 7554 1726853196.78322: stdout chunk (state=3): >>><<< 7554 1726853196.78339: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853196.78376: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853196.78383: _low_level_execute_command(): starting 7554 1726853196.78388: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853196.2215238-9410-42895826395847/ > /dev/null 2>&1 && sleep 0' 7554 1726853196.78833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.78837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.78839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.78841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.78899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853196.78903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853196.78914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853196.78966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853196.80835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853196.80856: stderr chunk (state=3): >>><<< 7554 1726853196.80859: stdout chunk (state=3): >>><<< 7554 1726853196.80875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853196.80882: handler run complete 7554 1726853196.80996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7554 1726853196.81127: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7554 1726853196.81160: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7554 1726853196.81187: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7554 1726853196.81494: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7554 1726853196.81551: variable '__install_status' from source: set_fact 7554 1726853196.81565: Evaluated conditional (__install_status is success): True 7554 1726853196.81578: attempt loop complete, returning result 7554 1726853196.81581: _execute() done 7554 1726853196.81584: dumping result to json 7554 1726853196.81590: done dumping result, returning 7554 1726853196.81598: done running TaskExecutor() for managed_node3/TASK: Install iproute [02083763-bbaf-bdc3-98b6-000000001a72] 7554 1726853196.81600: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a72 7554 1726853196.81694: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a72 7554 1726853196.81697: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7554 1726853196.81790: no more pending results, returning what we have 7554 1726853196.81793: results queue empty 7554 1726853196.81794: checking for any_errors_fatal 7554 1726853196.81800: done checking for any_errors_fatal 7554 1726853196.81801: checking for 
max_fail_percentage 7554 1726853196.81802: done checking for max_fail_percentage 7554 1726853196.81803: checking to see if all hosts have failed and the running result is not ok 7554 1726853196.81804: done checking to see if all hosts have failed 7554 1726853196.81805: getting the remaining hosts for this loop 7554 1726853196.81806: done getting the remaining hosts for this loop 7554 1726853196.81811: getting the next task for host managed_node3 7554 1726853196.81816: done getting next task for host managed_node3 7554 1726853196.81818: ^ task is: TASK: Create veth interface {{ interface }} 7554 1726853196.81821: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853196.81825: getting variables 7554 1726853196.81827: in VariableManager get_vars() 7554 1726853196.81880: Calling all_inventory to load vars for managed_node3 7554 1726853196.81883: Calling groups_inventory to load vars for managed_node3 7554 1726853196.81885: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853196.81895: Calling all_plugins_play to load vars for managed_node3 7554 1726853196.81898: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853196.81900: Calling groups_plugins_play to load vars for managed_node3 7554 1726853196.82832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853196.83674: done with get_vars() 7554 1726853196.83690: done getting variables 7554 1726853196.83734: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853196.83823: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:26:36 -0400 (0:00:00.676) 0:00:50.806 ****** 7554 1726853196.83848: entering _queue_task() for managed_node3/command 7554 1726853196.84090: worker is 1 (out of 1 available) 7554 1726853196.84104: exiting _queue_task() for managed_node3/command 7554 1726853196.84119: done queuing things up, now waiting for results queue to drain 7554 1726853196.84120: waiting for pending results... 
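The `TASK [Create veth interface veth0]` banner above dispatches a loop task whose three items and skip condition are recorded verbatim in the results that follow. A hedged reconstruction of what that task in `manage_test_interface.yml` likely looks like (assumption: inferred from the logged items and `false_condition`, not the verbatim playbook source):

```yaml
# Hypothetical sketch of the "Create veth interface" task implied by this log.
# The three loop items and the when: expression match the logged
# item/false_condition strings exactly; surrounding structure is assumed.
- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  loop:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces
```

Because `current_interfaces` already contains `veth0` at this point in the run, the condition evaluates False for every item and all three `ip link` commands are skipped, which is exactly what the per-item `skipping:` results below show.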
7554 1726853196.84319: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7554 1726853196.84395: in run() - task 02083763-bbaf-bdc3-98b6-000000001a73 7554 1726853196.84409: variable 'ansible_search_path' from source: unknown 7554 1726853196.84413: variable 'ansible_search_path' from source: unknown 7554 1726853196.84627: variable 'interface' from source: play vars 7554 1726853196.84683: variable 'interface' from source: play vars 7554 1726853196.84735: variable 'interface' from source: play vars 7554 1726853196.84851: Loaded config def from plugin (lookup/items) 7554 1726853196.84857: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7554 1726853196.84883: variable 'omit' from source: magic vars 7554 1726853196.84985: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.84995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.85003: variable 'omit' from source: magic vars 7554 1726853196.85164: variable 'ansible_distribution_major_version' from source: facts 7554 1726853196.85168: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853196.85300: variable 'type' from source: play vars 7554 1726853196.85304: variable 'state' from source: include params 7554 1726853196.85307: variable 'interface' from source: play vars 7554 1726853196.85309: variable 'current_interfaces' from source: set_fact 7554 1726853196.85317: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7554 1726853196.85321: when evaluation is False, skipping this task 7554 1726853196.85343: variable 'item' from source: unknown 7554 1726853196.85397: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 
'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 7554 1726853196.85544: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.85547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.85550: variable 'omit' from source: magic vars 7554 1726853196.85613: variable 'ansible_distribution_major_version' from source: facts 7554 1726853196.85617: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853196.85737: variable 'type' from source: play vars 7554 1726853196.85740: variable 'state' from source: include params 7554 1726853196.85743: variable 'interface' from source: play vars 7554 1726853196.85749: variable 'current_interfaces' from source: set_fact 7554 1726853196.85755: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7554 1726853196.85757: when evaluation is False, skipping this task 7554 1726853196.85781: variable 'item' from source: unknown 7554 1726853196.85822: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 7554 1726853196.85892: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.85896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.85899: variable 'omit' from source: magic vars 7554 1726853196.85997: variable 'ansible_distribution_major_version' from source: facts 7554 1726853196.86000: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853196.86118: variable 'type' from source: play vars 7554 
1726853196.86122: variable 'state' from source: include params 7554 1726853196.86124: variable 'interface' from source: play vars 7554 1726853196.86127: variable 'current_interfaces' from source: set_fact 7554 1726853196.86137: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7554 1726853196.86139: when evaluation is False, skipping this task 7554 1726853196.86156: variable 'item' from source: unknown 7554 1726853196.86198: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 7554 1726853196.86264: dumping result to json 7554 1726853196.86267: done dumping result, returning 7554 1726853196.86269: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [02083763-bbaf-bdc3-98b6-000000001a73] 7554 1726853196.86279: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a73 7554 1726853196.86312: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a73 7554 1726853196.86315: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false } MSG: All items skipped 7554 1726853196.86422: no more pending results, returning what we have 7554 1726853196.86425: results queue empty 7554 1726853196.86426: checking for any_errors_fatal 7554 1726853196.86431: done checking for any_errors_fatal 7554 1726853196.86432: checking for max_fail_percentage 7554 1726853196.86433: done checking for max_fail_percentage 7554 1726853196.86433: checking to see if all hosts have failed and the running result is not ok 7554 1726853196.86434: done checking to see if all hosts have failed 7554 1726853196.86435: getting the remaining hosts for this loop 7554 1726853196.86436: done getting the remaining hosts for this loop 
7554 1726853196.86439: getting the next task for host managed_node3 7554 1726853196.86444: done getting next task for host managed_node3 7554 1726853196.86447: ^ task is: TASK: Set up veth as managed by NetworkManager 7554 1726853196.86449: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853196.86453: getting variables 7554 1726853196.86454: in VariableManager get_vars() 7554 1726853196.86502: Calling all_inventory to load vars for managed_node3 7554 1726853196.86505: Calling groups_inventory to load vars for managed_node3 7554 1726853196.86507: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853196.86515: Calling all_plugins_play to load vars for managed_node3 7554 1726853196.86518: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853196.86520: Calling groups_plugins_play to load vars for managed_node3 7554 1726853196.87274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853196.88127: done with get_vars() 7554 1726853196.88147: done getting variables 7554 1726853196.88191: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] 
******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:26:36 -0400 (0:00:00.043) 0:00:50.849 ****** 7554 1726853196.88218: entering _queue_task() for managed_node3/command 7554 1726853196.88468: worker is 1 (out of 1 available) 7554 1726853196.88487: exiting _queue_task() for managed_node3/command 7554 1726853196.88500: done queuing things up, now waiting for results queue to drain 7554 1726853196.88501: waiting for pending results... 7554 1726853196.88691: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7554 1726853196.88773: in run() - task 02083763-bbaf-bdc3-98b6-000000001a74 7554 1726853196.88788: variable 'ansible_search_path' from source: unknown 7554 1726853196.88792: variable 'ansible_search_path' from source: unknown 7554 1726853196.88818: calling self._execute() 7554 1726853196.88904: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.88909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.88917: variable 'omit' from source: magic vars 7554 1726853196.89196: variable 'ansible_distribution_major_version' from source: facts 7554 1726853196.89206: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853196.89310: variable 'type' from source: play vars 7554 1726853196.89314: variable 'state' from source: include params 7554 1726853196.89320: Evaluated conditional (type == 'veth' and state == 'present'): False 7554 1726853196.89322: when evaluation is False, skipping this task 7554 1726853196.89325: _execute() done 7554 1726853196.89327: dumping result to json 7554 1726853196.89330: done dumping result, returning 7554 1726853196.89337: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-bdc3-98b6-000000001a74] 7554 
1726853196.89342: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a74 7554 1726853196.89428: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a74 7554 1726853196.89431: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 7554 1726853196.89476: no more pending results, returning what we have 7554 1726853196.89479: results queue empty 7554 1726853196.89480: checking for any_errors_fatal 7554 1726853196.89491: done checking for any_errors_fatal 7554 1726853196.89492: checking for max_fail_percentage 7554 1726853196.89493: done checking for max_fail_percentage 7554 1726853196.89494: checking to see if all hosts have failed and the running result is not ok 7554 1726853196.89495: done checking to see if all hosts have failed 7554 1726853196.89496: getting the remaining hosts for this loop 7554 1726853196.89497: done getting the remaining hosts for this loop 7554 1726853196.89501: getting the next task for host managed_node3 7554 1726853196.89506: done getting next task for host managed_node3 7554 1726853196.89508: ^ task is: TASK: Delete veth interface {{ interface }} 7554 1726853196.89512: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853196.89516: getting variables 7554 1726853196.89518: in VariableManager get_vars() 7554 1726853196.89565: Calling all_inventory to load vars for managed_node3 7554 1726853196.89568: Calling groups_inventory to load vars for managed_node3 7554 1726853196.89570: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853196.89584: Calling all_plugins_play to load vars for managed_node3 7554 1726853196.89587: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853196.89589: Calling groups_plugins_play to load vars for managed_node3 7554 1726853196.90507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853196.91348: done with get_vars() 7554 1726853196.91365: done getting variables 7554 1726853196.91411: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853196.91494: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 13:26:36 -0400 (0:00:00.032) 0:00:50.882 ****** 7554 1726853196.91519: entering _queue_task() for managed_node3/command 7554 1726853196.91772: worker is 1 (out of 1 available) 7554 1726853196.91787: exiting _queue_task() for managed_node3/command 7554 1726853196.91799: done queuing things up, now waiting for results queue to drain 7554 1726853196.91801: waiting for pending results... 
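The "Delete veth interface veth0" task being queued here (manage_test_interface.yml:43) can be sketched from what the log records: the evaluated conditional and the command the module ultimately runs, `ip link del veth0 type veth`. The exact YAML is an assumption reconstructed from those two facts:

```yaml
# Reconstructed sketch; the command and conditional are taken from the log,
# the surrounding YAML structure is an assumption.
- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }} type veth
  when: type == 'veth' and state == 'absent' and interface in current_interfaces
```

Deleting one end of a veth pair removes its peer as well, so a single `ip link del` is sufficient to tear down the whole pair.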
7554 1726853196.92001: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7554 1726853196.92083: in run() - task 02083763-bbaf-bdc3-98b6-000000001a75 7554 1726853196.92094: variable 'ansible_search_path' from source: unknown 7554 1726853196.92098: variable 'ansible_search_path' from source: unknown 7554 1726853196.92135: calling self._execute() 7554 1726853196.92217: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.92227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.92234: variable 'omit' from source: magic vars 7554 1726853196.92509: variable 'ansible_distribution_major_version' from source: facts 7554 1726853196.92519: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853196.92649: variable 'type' from source: play vars 7554 1726853196.92662: variable 'state' from source: include params 7554 1726853196.92665: variable 'interface' from source: play vars 7554 1726853196.92668: variable 'current_interfaces' from source: set_fact 7554 1726853196.92678: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 7554 1726853196.92683: variable 'omit' from source: magic vars 7554 1726853196.92712: variable 'omit' from source: magic vars 7554 1726853196.92785: variable 'interface' from source: play vars 7554 1726853196.92801: variable 'omit' from source: magic vars 7554 1726853196.92835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853196.92865: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853196.92885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853196.92900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 
1726853196.92909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853196.92934: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853196.92937: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.92939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.93017: Set connection var ansible_shell_executable to /bin/sh 7554 1726853196.93025: Set connection var ansible_pipelining to False 7554 1726853196.93028: Set connection var ansible_shell_type to sh 7554 1726853196.93030: Set connection var ansible_connection to ssh 7554 1726853196.93037: Set connection var ansible_timeout to 10 7554 1726853196.93042: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853196.93062: variable 'ansible_shell_executable' from source: unknown 7554 1726853196.93065: variable 'ansible_connection' from source: unknown 7554 1726853196.93068: variable 'ansible_module_compression' from source: unknown 7554 1726853196.93070: variable 'ansible_shell_type' from source: unknown 7554 1726853196.93074: variable 'ansible_shell_executable' from source: unknown 7554 1726853196.93076: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853196.93078: variable 'ansible_pipelining' from source: unknown 7554 1726853196.93082: variable 'ansible_timeout' from source: unknown 7554 1726853196.93093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853196.93190: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853196.93202: variable 'omit' from source: magic vars 7554 
1726853196.93205: starting attempt loop 7554 1726853196.93208: running the handler 7554 1726853196.93224: _low_level_execute_command(): starting 7554 1726853196.93231: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853196.93752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.93757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.93761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.93763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.93813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853196.93816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853196.93819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853196.93895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853196.95610: stdout chunk (state=3): >>>/root <<< 7554 1726853196.95713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853196.95739: stderr chunk 
(state=3): >>><<< 7554 1726853196.95745: stdout chunk (state=3): >>><<< 7554 1726853196.95764: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853196.95776: _low_level_execute_command(): starting 7554 1726853196.95782: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942 `" && echo ansible-tmp-1726853196.957633-9442-170988189083942="` echo /root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942 `" ) && sleep 0' 7554 1726853196.96230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853196.96243: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853196.96246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853196.96248: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853196.96250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.96296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853196.96300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853196.96305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853196.96367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853196.98336: stdout chunk (state=3): >>>ansible-tmp-1726853196.957633-9442-170988189083942=/root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942 <<< 7554 1726853196.98439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853196.98468: stderr chunk (state=3): >>><<< 7554 1726853196.98474: stdout chunk (state=3): >>><<< 7554 1726853196.98489: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853196.957633-9442-170988189083942=/root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853196.98519: variable 'ansible_module_compression' from source: unknown 7554 1726853196.98560: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853196.98589: variable 'ansible_facts' from source: unknown 7554 1726853196.98649: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942/AnsiballZ_command.py 7554 1726853196.98753: Sending initial data 7554 1726853196.98756: Sent initial data (153 bytes) 7554 1726853196.99205: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853196.99208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853196.99211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.99213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853196.99215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853196.99217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853196.99267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853196.99274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853196.99281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853196.99333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853197.00937: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 debug2: Server supports extension "copy-data" revision 1 <<< 7554 1726853197.00941: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853197.00995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853197.01052: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpk1qsn4s5 /root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942/AnsiballZ_command.py <<< 7554 1726853197.01058: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942/AnsiballZ_command.py" <<< 7554 1726853197.01108: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpk1qsn4s5" to remote "/root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942/AnsiballZ_command.py" <<< 7554 1726853197.01715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853197.01756: stderr chunk (state=3): >>><<< 7554 1726853197.01759: stdout chunk (state=3): >>><<< 7554 1726853197.01785: done transferring module to remote 7554 1726853197.01794: _low_level_execute_command(): starting 7554 1726853197.01798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942/ /root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942/AnsiballZ_command.py && sleep 0' 7554 1726853197.02239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853197.02242: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853197.02248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853197.02250: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853197.02252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.02298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853197.02301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853197.02365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853197.04197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853197.04222: stderr chunk (state=3): >>><<< 7554 1726853197.04225: stdout chunk (state=3): >>><<< 7554 1726853197.04246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853197.04250: _low_level_execute_command(): starting 7554 1726853197.04252: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942/AnsiballZ_command.py && sleep 0' 7554 1726853197.04682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853197.04685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.04687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853197.04689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853197.04691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.04741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853197.04749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853197.04751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853197.04814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853197.21690: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 13:26:37.204227", "end": "2024-09-20 13:26:37.215048", "delta": "0:00:00.010821", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853197.24041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853197.24068: stderr chunk (state=3): >>><<< 7554 1726853197.24073: stdout chunk (state=3): >>><<< 7554 1726853197.24088: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 13:26:37.204227", "end": "2024-09-20 13:26:37.215048", "delta": "0:00:00.010821", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
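Each `_low_level_execute_command()` above reuses one SSH connection: the repeated `debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b'` lines show the ssh connection plugin attaching to a persistent ControlMaster socket instead of performing a fresh handshake per command. Ansible enables this multiplexing by default; a hedged sketch of inventory vars making the equivalent settings explicit (values are illustrative assumptions, not taken from this run's configuration):

```yaml
# Illustrative group_vars entry; Ansible's ssh plugin already applies
# ControlMaster/ControlPersist defaults, so these values are assumptions
# shown only to make the multiplexing mechanism visible.
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=/root/.ansible/cp/%C
```

With a live master socket, the mkdir/chmod/module-run/cleanup round trips in this task each cost one multiplexed session (`mux_client_request_session`) rather than a full SSH connection setup.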
7554 1726853197.24116: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853197.24124: _low_level_execute_command(): starting 7554 1726853197.24129: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853196.957633-9442-170988189083942/ > /dev/null 2>&1 && sleep 0' 7554 1726853197.24598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853197.24602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853197.24604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853197.24607: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853197.24610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.24660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853197.24664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853197.24666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853197.24732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853197.26602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853197.26620: stderr chunk (state=3): >>><<< 7554 1726853197.26624: stdout chunk (state=3): >>><<< 7554 1726853197.26636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853197.26642: handler run complete 7554 1726853197.26662: Evaluated conditional (False): False 7554 1726853197.26672: attempt loop complete, returning result 7554 1726853197.26675: _execute() done 7554 1726853197.26677: dumping result to json 7554 1726853197.26681: done dumping result, returning 7554 1726853197.26693: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [02083763-bbaf-bdc3-98b6-000000001a75] 7554 1726853197.26698: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a75 7554 1726853197.26795: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a75 7554 1726853197.26798: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.010821", "end": "2024-09-20 13:26:37.215048", "rc": 0, "start": "2024-09-20 13:26:37.204227" } 7554 1726853197.26858: no more pending results, returning what we have 7554 1726853197.26861: results queue empty 7554 1726853197.26862: checking for any_errors_fatal 7554 1726853197.26870: done checking for any_errors_fatal 7554 1726853197.26872: checking for max_fail_percentage 7554 1726853197.26874: done checking for max_fail_percentage 7554 1726853197.26875: checking to see if all hosts have failed and the running result is not ok 7554 1726853197.26876: done checking to see if all hosts have failed 7554 1726853197.26877: getting the remaining hosts for this loop 7554 1726853197.26878: done getting the remaining hosts for this loop 7554 1726853197.26881: getting the next task for host managed_node3 7554 1726853197.26888: done getting next task for host managed_node3 7554 1726853197.26890: ^ task is: TASK: Create dummy interface {{ interface }} 7554 1726853197.26893: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853197.26897: getting variables 7554 1726853197.26899: in VariableManager get_vars() 7554 1726853197.26949: Calling all_inventory to load vars for managed_node3 7554 1726853197.26951: Calling groups_inventory to load vars for managed_node3 7554 1726853197.26954: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853197.26964: Calling all_plugins_play to load vars for managed_node3 7554 1726853197.26967: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853197.26969: Calling groups_plugins_play to load vars for managed_node3 7554 1726853197.28019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853197.29532: done with get_vars() 7554 1726853197.29563: done getting variables 7554 1726853197.29624: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853197.29737: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:26:37 -0400 (0:00:00.382) 0:00:51.265 ****** 7554 
1726853197.29769: entering _queue_task() for managed_node3/command 7554 1726853197.30111: worker is 1 (out of 1 available) 7554 1726853197.30124: exiting _queue_task() for managed_node3/command 7554 1726853197.30137: done queuing things up, now waiting for results queue to drain 7554 1726853197.30139: waiting for pending results... 7554 1726853197.30499: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7554 1726853197.30556: in run() - task 02083763-bbaf-bdc3-98b6-000000001a76 7554 1726853197.30581: variable 'ansible_search_path' from source: unknown 7554 1726853197.30591: variable 'ansible_search_path' from source: unknown 7554 1726853197.30777: calling self._execute() 7554 1726853197.30781: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853197.30783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853197.30787: variable 'omit' from source: magic vars 7554 1726853197.31127: variable 'ansible_distribution_major_version' from source: facts 7554 1726853197.31144: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853197.31357: variable 'type' from source: play vars 7554 1726853197.31369: variable 'state' from source: include params 7554 1726853197.31381: variable 'interface' from source: play vars 7554 1726853197.31390: variable 'current_interfaces' from source: set_fact 7554 1726853197.31403: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7554 1726853197.31410: when evaluation is False, skipping this task 7554 1726853197.31416: _execute() done 7554 1726853197.31423: dumping result to json 7554 1726853197.31430: done dumping result, returning 7554 1726853197.31446: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [02083763-bbaf-bdc3-98b6-000000001a76] 7554 1726853197.31456: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a76 7554 
1726853197.31677: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a76 7554 1726853197.31680: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7554 1726853197.31729: no more pending results, returning what we have 7554 1726853197.31733: results queue empty 7554 1726853197.31734: checking for any_errors_fatal 7554 1726853197.31744: done checking for any_errors_fatal 7554 1726853197.31745: checking for max_fail_percentage 7554 1726853197.31747: done checking for max_fail_percentage 7554 1726853197.31748: checking to see if all hosts have failed and the running result is not ok 7554 1726853197.31749: done checking to see if all hosts have failed 7554 1726853197.31750: getting the remaining hosts for this loop 7554 1726853197.31751: done getting the remaining hosts for this loop 7554 1726853197.31755: getting the next task for host managed_node3 7554 1726853197.31762: done getting next task for host managed_node3 7554 1726853197.31765: ^ task is: TASK: Delete dummy interface {{ interface }} 7554 1726853197.31769: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853197.31776: getting variables 7554 1726853197.31778: in VariableManager get_vars() 7554 1726853197.31833: Calling all_inventory to load vars for managed_node3 7554 1726853197.31836: Calling groups_inventory to load vars for managed_node3 7554 1726853197.31838: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853197.31850: Calling all_plugins_play to load vars for managed_node3 7554 1726853197.31853: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853197.31856: Calling groups_plugins_play to load vars for managed_node3 7554 1726853197.33460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853197.34929: done with get_vars() 7554 1726853197.34956: done getting variables 7554 1726853197.35012: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853197.35117: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:26:37 -0400 (0:00:00.053) 0:00:51.319 ****** 7554 1726853197.35146: entering _queue_task() for managed_node3/command 7554 1726853197.35486: worker is 1 (out of 1 available) 7554 1726853197.35499: exiting _queue_task() for managed_node3/command 7554 1726853197.35512: done queuing things up, now waiting for results queue to drain 7554 1726853197.35514: waiting for pending results... 
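The skipped tasks above all follow the same pattern: the executor evaluates a `when:` conjunction such as `type == 'dummy' and state == 'absent' and interface in current_interfaces`, and because this run uses `type == 'veth'`, every dummy/tap variant evaluates to False and is skipped. A minimal sketch of that short-circuit evaluation (hypothetical helper, not Ansible's actual internals):

```python
def should_run(task_type, desired_state, interface, current_interfaces,
               expect_type, expect_state, must_exist):
    """Mirror a 'when:' conjunction like
        type == 'dummy' and state == 'absent' and interface in current_interfaces
    Returning False corresponds to the 'skipping this task' lines in the log."""
    if task_type != expect_type or desired_state != expect_state:
        return False  # first two conjuncts fail; membership is never checked
    present = interface in current_interfaces
    return present if must_exist else not present

# In this run type == 'veth', so every 'dummy'/'tap' guard is False:
print(should_run('veth', 'absent', 'veth0', ['eth0'],
                 'dummy', 'absent', must_exist=True))  # False
```

Because `and` short-circuits, the `interface in current_interfaces` test is only reached when the type and state already match, which is why the log shows the full conjunction reported as `false_condition` rather than a per-term breakdown.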
7554 1726853197.35892: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7554 1726853197.35921: in run() - task 02083763-bbaf-bdc3-98b6-000000001a77 7554 1726853197.35943: variable 'ansible_search_path' from source: unknown 7554 1726853197.35952: variable 'ansible_search_path' from source: unknown 7554 1726853197.35998: calling self._execute() 7554 1726853197.36112: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853197.36126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853197.36141: variable 'omit' from source: magic vars 7554 1726853197.36506: variable 'ansible_distribution_major_version' from source: facts 7554 1726853197.36523: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853197.36725: variable 'type' from source: play vars 7554 1726853197.36736: variable 'state' from source: include params 7554 1726853197.36749: variable 'interface' from source: play vars 7554 1726853197.36758: variable 'current_interfaces' from source: set_fact 7554 1726853197.36773: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7554 1726853197.36781: when evaluation is False, skipping this task 7554 1726853197.36789: _execute() done 7554 1726853197.36797: dumping result to json 7554 1726853197.36804: done dumping result, returning 7554 1726853197.36814: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [02083763-bbaf-bdc3-98b6-000000001a77] 7554 1726853197.36825: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a77 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7554 1726853197.37011: no more pending results, returning what we have 7554 1726853197.37016: results queue empty 7554 1726853197.37017: checking for 
any_errors_fatal 7554 1726853197.37024: done checking for any_errors_fatal 7554 1726853197.37025: checking for max_fail_percentage 7554 1726853197.37026: done checking for max_fail_percentage 7554 1726853197.37027: checking to see if all hosts have failed and the running result is not ok 7554 1726853197.37029: done checking to see if all hosts have failed 7554 1726853197.37029: getting the remaining hosts for this loop 7554 1726853197.37031: done getting the remaining hosts for this loop 7554 1726853197.37034: getting the next task for host managed_node3 7554 1726853197.37041: done getting next task for host managed_node3 7554 1726853197.37044: ^ task is: TASK: Create tap interface {{ interface }} 7554 1726853197.37048: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853197.37053: getting variables 7554 1726853197.37055: in VariableManager get_vars() 7554 1726853197.37111: Calling all_inventory to load vars for managed_node3 7554 1726853197.37114: Calling groups_inventory to load vars for managed_node3 7554 1726853197.37117: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853197.37131: Calling all_plugins_play to load vars for managed_node3 7554 1726853197.37135: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853197.37139: Calling groups_plugins_play to load vars for managed_node3 7554 1726853197.37884: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a77 7554 1726853197.37888: WORKER PROCESS EXITING 7554 1726853197.38468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853197.44272: done with get_vars() 7554 1726853197.44297: done getting variables 7554 1726853197.44331: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853197.44409: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:26:37 -0400 (0:00:00.092) 0:00:51.411 ****** 7554 1726853197.44428: entering _queue_task() for managed_node3/command 7554 1726853197.44692: worker is 1 (out of 1 available) 7554 1726853197.44706: exiting _queue_task() for managed_node3/command 7554 1726853197.44717: done queuing things up, now waiting for results queue to drain 7554 1726853197.44719: waiting for pending results... 
7554 1726853197.44913: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7554 1726853197.45004: in run() - task 02083763-bbaf-bdc3-98b6-000000001a78 7554 1726853197.45015: variable 'ansible_search_path' from source: unknown 7554 1726853197.45019: variable 'ansible_search_path' from source: unknown 7554 1726853197.45052: calling self._execute() 7554 1726853197.45133: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853197.45138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853197.45147: variable 'omit' from source: magic vars 7554 1726853197.45435: variable 'ansible_distribution_major_version' from source: facts 7554 1726853197.45445: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853197.45588: variable 'type' from source: play vars 7554 1726853197.45591: variable 'state' from source: include params 7554 1726853197.45595: variable 'interface' from source: play vars 7554 1726853197.45600: variable 'current_interfaces' from source: set_fact 7554 1726853197.45610: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7554 1726853197.45613: when evaluation is False, skipping this task 7554 1726853197.45616: _execute() done 7554 1726853197.45618: dumping result to json 7554 1726853197.45620: done dumping result, returning 7554 1726853197.45622: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [02083763-bbaf-bdc3-98b6-000000001a78] 7554 1726853197.45629: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a78 7554 1726853197.45723: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a78 7554 1726853197.45725: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7554 
1726853197.45777: no more pending results, returning what we have 7554 1726853197.45780: results queue empty 7554 1726853197.45781: checking for any_errors_fatal 7554 1726853197.45787: done checking for any_errors_fatal 7554 1726853197.45788: checking for max_fail_percentage 7554 1726853197.45789: done checking for max_fail_percentage 7554 1726853197.45790: checking to see if all hosts have failed and the running result is not ok 7554 1726853197.45791: done checking to see if all hosts have failed 7554 1726853197.45791: getting the remaining hosts for this loop 7554 1726853197.45793: done getting the remaining hosts for this loop 7554 1726853197.45797: getting the next task for host managed_node3 7554 1726853197.45802: done getting next task for host managed_node3 7554 1726853197.45804: ^ task is: TASK: Delete tap interface {{ interface }} 7554 1726853197.45807: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853197.45811: getting variables 7554 1726853197.45813: in VariableManager get_vars() 7554 1726853197.45864: Calling all_inventory to load vars for managed_node3 7554 1726853197.45866: Calling groups_inventory to load vars for managed_node3 7554 1726853197.45869: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853197.45882: Calling all_plugins_play to load vars for managed_node3 7554 1726853197.45885: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853197.45889: Calling groups_plugins_play to load vars for managed_node3 7554 1726853197.47159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853197.48380: done with get_vars() 7554 1726853197.48398: done getting variables 7554 1726853197.48440: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7554 1726853197.48524: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:26:37 -0400 (0:00:00.041) 0:00:51.453 ****** 7554 1726853197.48548: entering _queue_task() for managed_node3/command 7554 1726853197.48804: worker is 1 (out of 1 available) 7554 1726853197.48818: exiting _queue_task() for managed_node3/command 7554 1726853197.48831: done queuing things up, now waiting for results queue to drain 7554 1726853197.48833: waiting for pending results... 
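Each Create/Delete task in manage_test_interface.yml wraps a single `ip link` invocation; the one task that actually ran reported `cmd: ["ip", "link", "del", "veth0", "type", "veth"]`. A hedged sketch of how such an argv list could be assembled (illustrative helper; the real task file templates the command in YAML, and a real `ip link add` for veth would also need a `peer name` argument):

```python
def ip_link_cmd(action, interface, link_type):
    """Build an argv list like the 'cmd' field in the task result above:
    ['ip', 'link', 'del', 'veth0', 'type', 'veth'].
    Simplified: omits extras such as 'peer name' for veth creation."""
    if action not in ('add', 'del'):
        raise ValueError(f'unsupported ip link action: {action}')
    return ['ip', 'link', action, interface, 'type', link_type]

print(ip_link_cmd('del', 'veth0', 'veth'))
# matches the cmd reported for TASK: Delete veth interface veth0
```

Passing an argv list (rather than a shell string) to the `command` module is what the structured `"cmd": [...]` field in the JSON result reflects.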
7554 1726853197.49024: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7554 1726853197.49106: in run() - task 02083763-bbaf-bdc3-98b6-000000001a79 7554 1726853197.49117: variable 'ansible_search_path' from source: unknown 7554 1726853197.49121: variable 'ansible_search_path' from source: unknown 7554 1726853197.49152: calling self._execute() 7554 1726853197.49239: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853197.49243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853197.49255: variable 'omit' from source: magic vars 7554 1726853197.49537: variable 'ansible_distribution_major_version' from source: facts 7554 1726853197.49549: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853197.49834: variable 'type' from source: play vars 7554 1726853197.49837: variable 'state' from source: include params 7554 1726853197.49839: variable 'interface' from source: play vars 7554 1726853197.49841: variable 'current_interfaces' from source: set_fact 7554 1726853197.49843: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7554 1726853197.49845: when evaluation is False, skipping this task 7554 1726853197.49846: _execute() done 7554 1726853197.49848: dumping result to json 7554 1726853197.49850: done dumping result, returning 7554 1726853197.49851: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [02083763-bbaf-bdc3-98b6-000000001a79] 7554 1726853197.49853: sending task result for task 02083763-bbaf-bdc3-98b6-000000001a79 7554 1726853197.49912: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001a79 7554 1726853197.49914: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7554 1726853197.49961: no 
more pending results, returning what we have 7554 1726853197.49964: results queue empty 7554 1726853197.49965: checking for any_errors_fatal 7554 1726853197.49972: done checking for any_errors_fatal 7554 1726853197.49973: checking for max_fail_percentage 7554 1726853197.49975: done checking for max_fail_percentage 7554 1726853197.49976: checking to see if all hosts have failed and the running result is not ok 7554 1726853197.49978: done checking to see if all hosts have failed 7554 1726853197.49978: getting the remaining hosts for this loop 7554 1726853197.49980: done getting the remaining hosts for this loop 7554 1726853197.49984: getting the next task for host managed_node3 7554 1726853197.49993: done getting next task for host managed_node3 7554 1726853197.49996: ^ task is: TASK: Verify network state restored to default 7554 1726853197.49999: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853197.50004: getting variables 7554 1726853197.50006: in VariableManager get_vars() 7554 1726853197.50058: Calling all_inventory to load vars for managed_node3 7554 1726853197.50061: Calling groups_inventory to load vars for managed_node3 7554 1726853197.50064: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853197.50092: Calling all_plugins_play to load vars for managed_node3 7554 1726853197.50096: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853197.50100: Calling groups_plugins_play to load vars for managed_node3 7554 1726853197.51370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853197.52218: done with get_vars() 7554 1726853197.52235: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:149 Friday 20 September 2024 13:26:37 -0400 (0:00:00.037) 0:00:51.490 ****** 7554 1726853197.52304: entering _queue_task() for managed_node3/include_tasks 7554 1726853197.52551: worker is 1 (out of 1 available) 7554 1726853197.52565: exiting _queue_task() for managed_node3/include_tasks 7554 1726853197.52579: done queuing things up, now waiting for results queue to drain 7554 1726853197.52581: waiting for pending results... 
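Every TASK banner in this log carries a profiling suffix such as `(0:00:00.037) 0:00:51.490`: the elapsed time of the previous task followed by the cumulative playbook time. A small parser for that suffix, with the format assumed from the banner lines above:

```python
import re

def parse_profile(line):
    """Extract (task_seconds, cumulative_seconds) from a banner line like
    'Friday 20 September 2024 13:26:37 -0400 (0:00:00.037) 0:00:51.490 ******'.
    Returns None if the line carries no profiling suffix."""
    m = re.search(r'\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)', line)
    if not m:
        return None
    h1, m1, s1, h2, m2, s2 = m.groups()
    to_seconds = lambda h, mi, s: int(h) * 3600 + int(mi) * 60 + float(s)
    return to_seconds(h1, m1, s1), to_seconds(h2, m2, s2)

banner = 'Friday 20 September 2024 13:26:37 -0400 (0:00:00.037) 0:00:51.490 ******'
print(parse_profile(banner))
```

Summing the per-task values across banners is a quick way to spot which steps (here, mostly sub-0.1 s conditional skips) dominate the 51-second cumulative runtime.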
7554 1726853197.52765: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 7554 1726853197.52842: in run() - task 02083763-bbaf-bdc3-98b6-000000000151 7554 1726853197.52857: variable 'ansible_search_path' from source: unknown 7554 1726853197.52888: calling self._execute() 7554 1726853197.52972: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853197.52979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853197.52988: variable 'omit' from source: magic vars 7554 1726853197.53276: variable 'ansible_distribution_major_version' from source: facts 7554 1726853197.53285: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853197.53291: _execute() done 7554 1726853197.53294: dumping result to json 7554 1726853197.53297: done dumping result, returning 7554 1726853197.53303: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [02083763-bbaf-bdc3-98b6-000000000151] 7554 1726853197.53309: sending task result for task 02083763-bbaf-bdc3-98b6-000000000151 7554 1726853197.53396: done sending task result for task 02083763-bbaf-bdc3-98b6-000000000151 7554 1726853197.53399: WORKER PROCESS EXITING 7554 1726853197.53428: no more pending results, returning what we have 7554 1726853197.53432: in VariableManager get_vars() 7554 1726853197.53488: Calling all_inventory to load vars for managed_node3 7554 1726853197.53491: Calling groups_inventory to load vars for managed_node3 7554 1726853197.53493: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853197.53505: Calling all_plugins_play to load vars for managed_node3 7554 1726853197.53509: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853197.53512: Calling groups_plugins_play to load vars for managed_node3 7554 1726853197.54289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 7554 1726853197.55156: done with get_vars() 7554 1726853197.55175: variable 'ansible_search_path' from source: unknown 7554 1726853197.55193: we have included files to process 7554 1726853197.55194: generating all_blocks data 7554 1726853197.55195: done generating all_blocks data 7554 1726853197.55201: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7554 1726853197.55203: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7554 1726853197.55204: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7554 1726853197.55476: done processing included file 7554 1726853197.55478: iterating over new_blocks loaded from include file 7554 1726853197.55479: in VariableManager get_vars() 7554 1726853197.55496: done with get_vars() 7554 1726853197.55497: filtering new block on tags 7554 1726853197.55509: done filtering new block on tags 7554 1726853197.55511: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 7554 1726853197.55515: extending task lists for all hosts with included blocks 7554 1726853197.58929: done extending task lists 7554 1726853197.58931: done processing included files 7554 1726853197.58932: results queue empty 7554 1726853197.58932: checking for any_errors_fatal 7554 1726853197.58935: done checking for any_errors_fatal 7554 1726853197.58935: checking for max_fail_percentage 7554 1726853197.58936: done checking for max_fail_percentage 7554 1726853197.58936: checking to see if all hosts have failed and the running result is not ok 7554 1726853197.58937: done checking to see if all hosts have failed 7554 1726853197.58938: getting the 
remaining hosts for this loop 7554 1726853197.58939: done getting the remaining hosts for this loop 7554 1726853197.58940: getting the next task for host managed_node3 7554 1726853197.58943: done getting next task for host managed_node3 7554 1726853197.58945: ^ task is: TASK: Check routes and DNS 7554 1726853197.58946: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853197.58948: getting variables 7554 1726853197.58949: in VariableManager get_vars() 7554 1726853197.58965: Calling all_inventory to load vars for managed_node3 7554 1726853197.58967: Calling groups_inventory to load vars for managed_node3 7554 1726853197.58968: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853197.58976: Calling all_plugins_play to load vars for managed_node3 7554 1726853197.58977: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853197.58979: Calling groups_plugins_play to load vars for managed_node3 7554 1726853197.59640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853197.60491: done with get_vars() 7554 1726853197.60510: done getting variables 7554 1726853197.60543: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:26:37 -0400 (0:00:00.082) 0:00:51.573 ****** 7554 1726853197.60565: entering _queue_task() for managed_node3/shell 7554 1726853197.60841: worker is 1 (out of 1 available) 7554 1726853197.60855: exiting _queue_task() for managed_node3/shell 7554 1726853197.60868: done queuing things up, now waiting for results queue to drain 7554 1726853197.60870: waiting for pending results... 7554 1726853197.61061: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 7554 1726853197.61137: in run() - task 02083763-bbaf-bdc3-98b6-000000001d93 7554 1726853197.61152: variable 'ansible_search_path' from source: unknown 7554 1726853197.61155: variable 'ansible_search_path' from source: unknown 7554 1726853197.61184: calling self._execute() 7554 1726853197.61268: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853197.61273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853197.61281: variable 'omit' from source: magic vars 7554 1726853197.61567: variable 'ansible_distribution_major_version' from source: facts 7554 1726853197.61579: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853197.61586: variable 'omit' from source: magic vars 7554 1726853197.61613: variable 'omit' from source: magic vars 7554 1726853197.61639: variable 'omit' from source: magic vars 7554 1726853197.61675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853197.61702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853197.61720: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853197.61733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853197.61743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853197.61774: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853197.61777: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853197.61781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853197.61852: Set connection var ansible_shell_executable to /bin/sh 7554 1726853197.61860: Set connection var ansible_pipelining to False 7554 1726853197.61863: Set connection var ansible_shell_type to sh 7554 1726853197.61865: Set connection var ansible_connection to ssh 7554 1726853197.61876: Set connection var ansible_timeout to 10 7554 1726853197.61879: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853197.61898: variable 'ansible_shell_executable' from source: unknown 7554 1726853197.61901: variable 'ansible_connection' from source: unknown 7554 1726853197.61904: variable 'ansible_module_compression' from source: unknown 7554 1726853197.61906: variable 'ansible_shell_type' from source: unknown 7554 1726853197.61909: variable 'ansible_shell_executable' from source: unknown 7554 1726853197.61911: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853197.61913: variable 'ansible_pipelining' from source: unknown 7554 1726853197.61915: variable 'ansible_timeout' from source: unknown 7554 1726853197.61920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853197.62026: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853197.62035: variable 'omit' from source: magic vars 7554 1726853197.62039: starting attempt loop 7554 1726853197.62042: running the handler 7554 1726853197.62055: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853197.62070: _low_level_execute_command(): starting 7554 1726853197.62078: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853197.62605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853197.62609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7554 1726853197.62613: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853197.62616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.62666: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853197.62669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853197.62674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853197.62744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853197.64455: stdout chunk (state=3): >>>/root <<< 7554 1726853197.64549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853197.64584: stderr chunk (state=3): >>><<< 7554 1726853197.64587: stdout chunk (state=3): >>><<< 7554 1726853197.64612: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853197.64625: _low_level_execute_command(): starting 7554 1726853197.64631: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365 `" && echo ansible-tmp-1726853197.6461203-9463-253169982542365="` echo /root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365 `" ) && sleep 0' 7554 1726853197.65107: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853197.65118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.65122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853197.65124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.65168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853197.65173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853197.65180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853197.65240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853197.67166: stdout chunk (state=3): 
>>>ansible-tmp-1726853197.6461203-9463-253169982542365=/root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365 <<< 7554 1726853197.67264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853197.67295: stderr chunk (state=3): >>><<< 7554 1726853197.67298: stdout chunk (state=3): >>><<< 7554 1726853197.67315: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853197.6461203-9463-253169982542365=/root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853197.67342: variable 'ansible_module_compression' from source: unknown 7554 1726853197.67391: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853197.67423: variable 'ansible_facts' from 
source: unknown 7554 1726853197.67485: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365/AnsiballZ_command.py 7554 1726853197.67592: Sending initial data 7554 1726853197.67596: Sent initial data (154 bytes) 7554 1726853197.68040: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853197.68080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853197.68083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.68087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853197.68089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 7554 1726853197.68092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.68140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853197.68146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853197.68149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853197.68205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853197.69802: stderr chunk (state=3): >>>debug2: Remote version: 
3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853197.69867: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7554 1726853197.69932: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpjgicmqo2 /root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365/AnsiballZ_command.py <<< 7554 1726853197.69950: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365/AnsiballZ_command.py" <<< 7554 1726853197.69992: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 7554 1726853197.70044: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmpjgicmqo2" to remote "/root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365/AnsiballZ_command.py" <<< 7554 1726853197.70703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853197.70750: stderr chunk (state=3): >>><<< 7554 1726853197.70754: stdout chunk (state=3): >>><<< 7554 1726853197.70787: done transferring module to remote 7554 1726853197.70796: 
_low_level_execute_command(): starting 7554 1726853197.70800: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365/ /root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365/AnsiballZ_command.py && sleep 0' 7554 1726853197.71251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853197.71256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 7554 1726853197.71263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.71266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853197.71268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.71315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853197.71318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853197.71320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853197.71387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853197.73537: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 7554 1726853197.73542: stdout chunk (state=3): >>><<< 7554 1726853197.73545: stderr chunk (state=3): >>><<< 7554 1726853197.73547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853197.73550: _low_level_execute_command(): starting 7554 1726853197.73552: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365/AnsiballZ_command.py && sleep 0' 7554 1726853197.74648: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853197.74881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853197.74996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853197.75017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853197.75124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853197.91587: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3282sec preferred_lft 3282sec\n inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# 
Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:26:37.905461", "end": "2024-09-20 13:26:37.914411", "delta": "0:00:00.008950", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 7554 1726853197.91616: stdout chunk (state=3): >>> <<< 7554 1726853197.93245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853197.93306: stderr chunk (state=3): >>><<< 7554 1726853197.93316: stdout chunk (state=3): >>><<< 7554 1726853197.93347: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3282sec preferred_lft 3282sec\n inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:26:37.905461", "end": "2024-09-20 13:26:37.914411", "delta": "0:00:00.008950", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": 
null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
7554 1726853197.93404: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853197.93417: _low_level_execute_command(): starting 7554 1726853197.93427: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853197.6461203-9463-253169982542365/ > /dev/null 2>&1 && sleep 0' 7554 1726853197.94081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853197.94101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853197.94189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853197.94231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853197.94248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853197.94272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853197.94363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853197.96280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853197.96301: stdout chunk (state=3): >>><<< 7554 1726853197.96315: stderr chunk (state=3): >>><<< 7554 1726853197.96477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853197.96481: handler run complete 7554 1726853197.96483: Evaluated conditional (False): False 7554 1726853197.96485: attempt loop complete, returning result 7554 1726853197.96487: _execute() done 7554 1726853197.96489: dumping result to json 7554 1726853197.96491: done dumping result, returning 7554 1726853197.96493: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [02083763-bbaf-bdc3-98b6-000000001d93] 7554 1726853197.96495: sending task result for task 02083763-bbaf-bdc3-98b6-000000001d93 ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008950", "end": "2024-09-20 13:26:37.914411", "rc": 0, "start": "2024-09-20 13:26:37.905461" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3282sec preferred_lft 3282sec inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel 
metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 7554 1726853197.96656: no more pending results, returning what we have 7554 1726853197.96660: results queue empty 7554 1726853197.96661: checking for any_errors_fatal 7554 1726853197.96662: done checking for any_errors_fatal 7554 1726853197.96663: checking for max_fail_percentage 7554 1726853197.96665: done checking for max_fail_percentage 7554 1726853197.96666: checking to see if all hosts have failed and the running result is not ok 7554 1726853197.96667: done checking to see if all hosts have failed 7554 1726853197.96668: getting the remaining hosts for this loop 7554 1726853197.96669: done getting the remaining hosts for this loop 7554 1726853197.96675: getting the next task for host managed_node3 7554 1726853197.96878: done getting next task for host managed_node3 7554 1726853197.96881: ^ task is: TASK: Verify DNS and network connectivity 7554 1726853197.96884: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853197.96889: getting variables 7554 1726853197.96890: in VariableManager get_vars() 7554 1726853197.96940: Calling all_inventory to load vars for managed_node3 7554 1726853197.96943: Calling groups_inventory to load vars for managed_node3 7554 1726853197.96945: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853197.96968: Calling all_plugins_play to load vars for managed_node3 7554 1726853197.96974: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853197.96977: Calling groups_plugins_play to load vars for managed_node3 7554 1726853197.97574: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001d93 7554 1726853197.97578: WORKER PROCESS EXITING 7554 1726853197.98710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853198.00579: done with get_vars() 7554 1726853198.00614: done getting variables 7554 1726853198.00676: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:26:38 -0400 (0:00:00.401) 0:00:51.974 ****** 7554 1726853198.00712: entering _queue_task() for managed_node3/shell 7554 1726853198.01175: worker is 1 (out of 1 available) 7554 1726853198.01185: exiting _queue_task() for managed_node3/shell 7554 1726853198.01195: done queuing things up, now waiting for results queue to drain 7554 1726853198.01197: waiting for pending results... 
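The repeated `debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b'` lines throughout this log show OpenSSH connection multiplexing: Ansible's `ssh` connection plugin reuses one master connection per host instead of opening a new TCP/SSH session for every command. A rough client-side equivalent of what the plugin passes via `ssh_args` might look like the sketch below (the `Host` pattern and exact values are illustrative assumptions, not read from this log; only the `~/.ansible/cp/` socket directory is confirmed by the output above):

```
# Illustrative ~/.ssh/config equivalent of Ansible's default
# multiplexing options (-o ControlMaster=auto -o ControlPersist=60s).
# The hashed socket files land under ~/.ansible/cp/, matching the
# '/root/.ansible/cp/bee039678b' master seen in this log.
Host 10.31.11.*
    ControlMaster auto
    ControlPersist 60s
    ControlPath ~/.ansible/cp/%C
```

With a live master socket, each `_low_level_execute_command()` call in the log only pays the `mux_client_request_session` round trip, which is why the per-command stderr shows session setup but no key exchange or authentication.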
7554 1726853198.01411: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 7554 1726853198.01533: in run() - task 02083763-bbaf-bdc3-98b6-000000001d94 7554 1726853198.01554: variable 'ansible_search_path' from source: unknown 7554 1726853198.01562: variable 'ansible_search_path' from source: unknown 7554 1726853198.01606: calling self._execute() 7554 1726853198.01714: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853198.01726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853198.01738: variable 'omit' from source: magic vars 7554 1726853198.02141: variable 'ansible_distribution_major_version' from source: facts 7554 1726853198.02160: Evaluated conditional (ansible_distribution_major_version != '6'): True 7554 1726853198.02315: variable 'ansible_facts' from source: unknown 7554 1726853198.03135: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 7554 1726853198.03147: variable 'omit' from source: magic vars 7554 1726853198.03204: variable 'omit' from source: magic vars 7554 1726853198.03243: variable 'omit' from source: magic vars 7554 1726853198.03291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7554 1726853198.03340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7554 1726853198.03368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7554 1726853198.03393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853198.03419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7554 1726853198.03456: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7554 1726853198.03466: variable 'ansible_host' from 
source: host vars for 'managed_node3' 7554 1726853198.03476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853198.03593: Set connection var ansible_shell_executable to /bin/sh 7554 1726853198.03608: Set connection var ansible_pipelining to False 7554 1726853198.03616: Set connection var ansible_shell_type to sh 7554 1726853198.03623: Set connection var ansible_connection to ssh 7554 1726853198.03647: Set connection var ansible_timeout to 10 7554 1726853198.03658: Set connection var ansible_module_compression to ZIP_DEFLATED 7554 1726853198.03688: variable 'ansible_shell_executable' from source: unknown 7554 1726853198.03696: variable 'ansible_connection' from source: unknown 7554 1726853198.03704: variable 'ansible_module_compression' from source: unknown 7554 1726853198.03711: variable 'ansible_shell_type' from source: unknown 7554 1726853198.03741: variable 'ansible_shell_executable' from source: unknown 7554 1726853198.03744: variable 'ansible_host' from source: host vars for 'managed_node3' 7554 1726853198.03746: variable 'ansible_pipelining' from source: unknown 7554 1726853198.03749: variable 'ansible_timeout' from source: unknown 7554 1726853198.03752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7554 1726853198.03906: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853198.03959: variable 'omit' from source: magic vars 7554 1726853198.03962: starting attempt loop 7554 1726853198.03965: running the handler 7554 1726853198.03968: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7554 1726853198.03983: _low_level_execute_command(): starting 7554 1726853198.03995: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7554 1726853198.04792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853198.04870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853198.04895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853198.04999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853198.06729: stdout chunk (state=3): >>>/root <<< 7554 1726853198.06882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853198.06885: stdout chunk (state=3): >>><<< 7554 1726853198.06888: stderr chunk (state=3): >>><<< 7554 1726853198.06909: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853198.06929: _low_level_execute_command(): starting 7554 1726853198.06940: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902 `" && echo ansible-tmp-1726853198.0691683-9489-173359628113902="` echo /root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902 `" ) && sleep 0' 7554 1726853198.07656: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853198.07764: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853198.07791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853198.07898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853198.09833: stdout chunk (state=3): >>>ansible-tmp-1726853198.0691683-9489-173359628113902=/root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902 <<< 7554 1726853198.09998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853198.10002: stdout chunk (state=3): >>><<< 7554 1726853198.10005: stderr chunk (state=3): >>><<< 7554 1726853198.10026: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853198.0691683-9489-173359628113902=/root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853198.10177: variable 'ansible_module_compression' from source: unknown 7554 1726853198.10180: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7554rfdzaxsk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7554 1726853198.10186: variable 'ansible_facts' from source: unknown 7554 1726853198.10281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902/AnsiballZ_command.py 7554 1726853198.10501: Sending initial data 7554 1726853198.10504: Sent initial data (154 bytes) 7554 1726853198.11152: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853198.11190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853198.11284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853198.11305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853198.11322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853198.11342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853198.11510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853198.13124: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7554 1726853198.13142: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7554 1726853198.13232: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7554 1726853198.13303: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp6phsokpf /root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902/AnsiballZ_command.py <<< 7554 1726853198.13306: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902/AnsiballZ_command.py" <<< 7554 1726853198.13355: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7554rfdzaxsk/tmp6phsokpf" to remote "/root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902/AnsiballZ_command.py" <<< 7554 1726853198.14241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853198.14392: stderr chunk (state=3): >>><<< 7554 1726853198.14395: stdout chunk (state=3): >>><<< 7554 1726853198.14397: done transferring module to remote 7554 1726853198.14399: _low_level_execute_command(): starting 7554 1726853198.14402: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902/ /root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902/AnsiballZ_command.py && sleep 0' 7554 1726853198.15977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853198.15993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853198.16151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853198.16395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853198.16454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853198.18416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853198.18419: stdout chunk (state=3): >>><<< 7554 1726853198.18422: stderr chunk (state=3): >>><<< 7554 1726853198.18424: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853198.18426: _low_level_execute_command(): starting 7554 1726853198.18429: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902/AnsiballZ_command.py && sleep 0' 7554 1726853198.19697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853198.19701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853198.19713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853198.19770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853198.19782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853198.19959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853198.20232: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853198.79730: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1789 0 --:--:-- --:--:-- --:--:-- 1794\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1450 0 --:--:-- --:--:-- --:--:-- 1455", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:26:38.357241", "end": "2024-09-20 13:26:38.795602", "delta": "0:00:00.438361", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7554 1726853198.81429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 7554 1726853198.81433: stdout chunk (state=3): >>><<< 7554 1726853198.81438: stderr chunk (state=3): >>><<< 7554 1726853198.81464: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1789 0 --:--:-- --:--:-- --:--:-- 1794\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1450 0 --:--:-- --:--:-- --:--:-- 1455", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:26:38.357241", "end": "2024-09-20 13:26:38.795602", "delta": "0:00:00.438361", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 7554 1726853198.81778: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7554 1726853198.81782: _low_level_execute_command(): starting 7554 1726853198.81785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853198.0691683-9489-173359628113902/ > /dev/null 2>&1 && sleep 0' 7554 1726853198.82809: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7554 1726853198.82990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853198.83000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853198.83015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853198.83029: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853198.83036: stderr chunk (state=3): >>>debug2: match not found <<< 7554 1726853198.83048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853198.83061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7554 1726853198.83123: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 7554 1726853198.83126: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7554 1726853198.83129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7554 1726853198.83131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7554 1726853198.83133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7554 1726853198.83135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 7554 1726853198.83137: stderr chunk (state=3): >>>debug2: match found <<< 7554 1726853198.83139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7554 1726853198.83199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 7554 1726853198.83292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7554 1726853198.83302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7554 1726853198.83394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7554 1726853198.85477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7554 1726853198.85481: stdout chunk (state=3): >>><<< 7554 1726853198.85483: stderr chunk (state=3): >>><<< 7554 1726853198.85486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7554 1726853198.85488: handler run complete 7554 1726853198.85491: Evaluated conditional (False): False 7554 1726853198.85493: attempt loop complete, returning result 7554 1726853198.85495: _execute() done 7554 1726853198.85496: dumping result to json 7554 1726853198.85498: done dumping result, returning 7554 1726853198.85500: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [02083763-bbaf-bdc3-98b6-000000001d94] 7554 1726853198.85502: sending task result for task 02083763-bbaf-bdc3-98b6-000000001d94 7554 1726853198.85579: done sending task result for task 02083763-bbaf-bdc3-98b6-000000001d94 7554 1726853198.85583: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.438361", "end": "2024-09-20 13:26:38.795602", "rc": 0, "start": "2024-09-20 13:26:38.357241" }

STDOUT:

CHECK DNS AND CONNECTIVITY
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   1789      0 --:--:-- --:--:-- --:--:--  1794
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   1450      0 --:--:-- --:--:-- --:--:--  1455

7554 1726853198.85660: no
more pending results, returning what we have 7554 1726853198.85664: results queue empty 7554 1726853198.85665: checking for any_errors_fatal 7554 1726853198.85678: done checking for any_errors_fatal 7554 1726853198.85681: checking for max_fail_percentage 7554 1726853198.85682: done checking for max_fail_percentage 7554 1726853198.85683: checking to see if all hosts have failed and the running result is not ok 7554 1726853198.85684: done checking to see if all hosts have failed 7554 1726853198.85685: getting the remaining hosts for this loop 7554 1726853198.85686: done getting the remaining hosts for this loop 7554 1726853198.85690: getting the next task for host managed_node3 7554 1726853198.85699: done getting next task for host managed_node3 7554 1726853198.85702: ^ task is: TASK: meta (flush_handlers) 7554 1726853198.85704: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853198.85709: getting variables 7554 1726853198.85711: in VariableManager get_vars() 7554 1726853198.85760: Calling all_inventory to load vars for managed_node3 7554 1726853198.85763: Calling groups_inventory to load vars for managed_node3 7554 1726853198.85765: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853198.85982: Calling all_plugins_play to load vars for managed_node3 7554 1726853198.85986: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853198.85990: Calling groups_plugins_play to load vars for managed_node3 7554 1726853198.87394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853198.89170: done with get_vars() 7554 1726853198.89204: done getting variables 7554 1726853198.89285: in VariableManager get_vars() 7554 1726853198.89307: Calling all_inventory to load vars for managed_node3 7554 1726853198.89310: Calling groups_inventory to load vars for managed_node3 7554 1726853198.89312: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853198.89317: Calling all_plugins_play to load vars for managed_node3 7554 1726853198.89319: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853198.89321: Calling groups_plugins_play to load vars for managed_node3 7554 1726853198.90635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853198.92385: done with get_vars() 7554 1726853198.92422: done queuing things up, now waiting for results queue to drain 7554 1726853198.92424: results queue empty 7554 1726853198.92425: checking for any_errors_fatal 7554 1726853198.92429: done checking for any_errors_fatal 7554 1726853198.92429: checking for max_fail_percentage 7554 1726853198.92430: done checking for max_fail_percentage 7554 1726853198.92431: checking to see if all hosts have failed and the running result is not ok 7554 1726853198.92432: 
done checking to see if all hosts have failed 7554 1726853198.92433: getting the remaining hosts for this loop 7554 1726853198.92434: done getting the remaining hosts for this loop 7554 1726853198.92437: getting the next task for host managed_node3 7554 1726853198.92440: done getting next task for host managed_node3 7554 1726853198.92442: ^ task is: TASK: meta (flush_handlers) 7554 1726853198.92446: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7554 1726853198.92449: getting variables 7554 1726853198.92450: in VariableManager get_vars() 7554 1726853198.92468: Calling all_inventory to load vars for managed_node3 7554 1726853198.92472: Calling groups_inventory to load vars for managed_node3 7554 1726853198.92474: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853198.92480: Calling all_plugins_play to load vars for managed_node3 7554 1726853198.92482: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853198.92485: Calling groups_plugins_play to load vars for managed_node3 7554 1726853198.93975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853198.95600: done with get_vars() 7554 1726853198.95620: done getting variables 7554 1726853198.95674: in VariableManager get_vars() 7554 1726853198.95694: Calling all_inventory to load vars for managed_node3 7554 1726853198.95696: Calling groups_inventory to load vars for managed_node3 7554 1726853198.95698: Calling all_plugins_inventory to load vars for managed_node3 7554 1726853198.95703: Calling all_plugins_play to load vars for managed_node3 7554 1726853198.95705: Calling groups_plugins_inventory to load vars for managed_node3 7554 1726853198.95708: Calling 
groups_plugins_play to load vars for managed_node3 7554 1726853198.96826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7554 1726853198.98878: done with get_vars() 7554 1726853198.98918: done queuing things up, now waiting for results queue to drain 7554 1726853198.98921: results queue empty 7554 1726853198.98922: checking for any_errors_fatal 7554 1726853198.98923: done checking for any_errors_fatal 7554 1726853198.98924: checking for max_fail_percentage 7554 1726853198.98925: done checking for max_fail_percentage 7554 1726853198.98926: checking to see if all hosts have failed and the running result is not ok 7554 1726853198.98926: done checking to see if all hosts have failed 7554 1726853198.98927: getting the remaining hosts for this loop 7554 1726853198.98928: done getting the remaining hosts for this loop 7554 1726853198.98931: getting the next task for host managed_node3 7554 1726853198.98934: done getting next task for host managed_node3 7554 1726853198.98935: ^ task is: None 7554 1726853198.98936: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7554 1726853198.98937: done queuing things up, now waiting for results queue to drain 7554 1726853198.98938: results queue empty 7554 1726853198.98939: checking for any_errors_fatal 7554 1726853198.98940: done checking for any_errors_fatal 7554 1726853198.98940: checking for max_fail_percentage 7554 1726853198.98941: done checking for max_fail_percentage 7554 1726853198.98942: checking to see if all hosts have failed and the running result is not ok 7554 1726853198.98942: done checking to see if all hosts have failed 7554 1726853198.98948: getting the next task for host managed_node3 7554 1726853198.98951: done getting next task for host managed_node3 7554 1726853198.98952: ^ task is: None 7554 1726853198.98953: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node3              : ok=128  changed=4    unreachable=0    failed=0    skipped=118  rescued=0    ignored=0

Friday 20 September 2024  13:26:38 -0400 (0:00:00.984)       0:00:52.958 ******
===============================================================================
Install iproute --------------------------------------------------------- 3.32s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which services are running ---- 2.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.92s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.83s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.81s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.65s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.30s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Create veth interface veth0 --------------------------------------------- 1.16s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Create veth interface veth0 --------------------------------------------- 1.14s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.01s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:3
Verify DNS and network connectivity ------------------------------------- 0.98s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Check which packages are installed --- 0.94s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.93s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.88s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.81s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.79s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.77s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Install iproute --------------------------------------------------------- 0.76s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.76s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.75s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
7554 1726853198.99206: RUNNING CLEANUP
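For reference, the "Verify DNS and network connectivity" task above runs a shell loop of `getent hosts` plus `curl` over the two mirror hosts. The same logic can be sketched in standalone Python; `check_dns`, `check_https`, and `main` are hypothetical helper names for illustration, not part of the playbook. Note one behavioral difference: `urlopen` raises on HTTP 4xx/5xx responses, whereas plain `curl` (without `-f`) exits 0 on them.

```python
import socket
import urllib.request

# Hosts checked by the playbook task.
HOSTS = ["mirrors.fedoraproject.org", "mirrors.centos.org"]

def check_dns(host: str) -> bool:
    """Return True if the host name resolves (the `getent hosts` step)."""
    try:
        socket.getaddrinfo(host, None)
        return True
    except socket.gaierror:
        return False

def check_https(host: str, timeout: float = 10.0) -> bool:
    """Return True if an HTTPS request to the host succeeds (the `curl` step)."""
    try:
        urllib.request.urlopen(f"https://{host}/", timeout=timeout)
        return True
    except OSError:
        return False

def main() -> int:
    """Mirror the task's exit behavior: nonzero on the first failing host."""
    for host in HOSTS:
        if not check_dns(host):
            print(f"FAILED to lookup host {host}")
            return 1
        if not check_https(host):
            print(f"FAILED to contact host {host}")
            return 1
    print("CHECK DNS AND CONNECTIVITY")
    return 0

# Call main() to run the actual (network-dependent) check.
```

Like the shell original, this stops at the first failure, so a DNS failure for a host is reported before connectivity is attempted.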